Dec 12 00:23:28 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 12 00:23:28 crc restorecon[4580]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 12 00:23:28 crc restorecon[4580]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc 
restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc 
restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 
00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc 
restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc 
restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:28
crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 
00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc 
restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc 
restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 12 00:23:28 crc restorecon[4580]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc 
restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:28 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 
crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc 
restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc 
restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc 
restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc 
restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 12 00:23:29 crc restorecon[4580]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 12 00:23:29 crc kubenswrapper[4606]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 00:23:29 crc kubenswrapper[4606]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 12 00:23:29 crc kubenswrapper[4606]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 00:23:29 crc kubenswrapper[4606]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 12 00:23:29 crc kubenswrapper[4606]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 12 00:23:29 crc kubenswrapper[4606]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.580028 4606 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583031 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583051 4606 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583056 4606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583060 4606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583065 4606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583071 4606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583075 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583080 4606 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583084 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583088 4606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583106 4606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583110 4606 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583114 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583118 4606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583121 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583125 4606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583128 4606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583132 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583135 4606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 12 00:23:29 crc kubenswrapper[4606]: 
W1212 00:23:29.583138 4606 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583142 4606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583146 4606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583150 4606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583153 4606 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583157 4606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583160 4606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583163 4606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583183 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583187 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583191 4606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583196 4606 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583200 4606 feature_gate.go:330] unrecognized feature gate: Example
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583205 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583209 4606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583212 4606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583216 4606 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583220 4606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583224 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583227 4606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583231 4606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583235 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583240 4606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583244 4606 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583248 4606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583252 4606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583256 4606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583259 4606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583263 4606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583266 4606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583270 4606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583273 4606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583277 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583280 4606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583286 4606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583290 4606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583294 4606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583298 4606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583301 4606 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583305 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583309 4606 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583312 4606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583316 4606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583319 4606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583323 4606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583327 4606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583334 4606 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583342 4606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583346 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583351 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583356 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.583360 4606 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583583 4606 flags.go:64] FLAG: --address="0.0.0.0"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583600 4606 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583609 4606 flags.go:64] FLAG: --anonymous-auth="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583616 4606 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583622 4606 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583628 4606 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583636 4606 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583642 4606 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583647 4606 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583652 4606 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583658 4606 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583663 4606 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583668 4606 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583673 4606 flags.go:64] FLAG: --cgroup-root=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583678 4606 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583682 4606 flags.go:64] FLAG: --client-ca-file=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583687 4606 flags.go:64] FLAG: --cloud-config=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583691 4606 flags.go:64] FLAG: --cloud-provider=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583696 4606 flags.go:64] FLAG: --cluster-dns="[]"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583702 4606 flags.go:64] FLAG: --cluster-domain=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583707 4606 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583713 4606 flags.go:64] FLAG: --config-dir=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583718 4606 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583723 4606 flags.go:64] FLAG: --container-log-max-files="5"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583730 4606 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583735 4606 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583740 4606 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583745 4606 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583752 4606 flags.go:64] FLAG: --contention-profiling="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583757 4606 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583762 4606 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583767 4606 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583772 4606 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583778 4606 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583784 4606 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583788 4606 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583794 4606 flags.go:64] FLAG: --enable-load-reader="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583799 4606 flags.go:64] FLAG: --enable-server="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583804 4606 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583811 4606 flags.go:64] FLAG: --event-burst="100"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583816 4606 flags.go:64] FLAG: --event-qps="50"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583821 4606 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583826 4606 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583831 4606 flags.go:64] FLAG: --eviction-hard=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583837 4606 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583843 4606 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583847 4606 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583852 4606 flags.go:64] FLAG: --eviction-soft=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583857 4606 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583861 4606 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583865 4606 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583870 4606 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583873 4606 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583877 4606 flags.go:64] FLAG: --fail-swap-on="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583882 4606 flags.go:64] FLAG: --feature-gates=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583887 4606 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583892 4606 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583898 4606 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583903 4606 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583908 4606 flags.go:64] FLAG: --healthz-port="10248"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583914 4606 flags.go:64] FLAG: --help="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583919 4606 flags.go:64] FLAG: --hostname-override=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583924 4606 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583930 4606 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583935 4606 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583940 4606 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583945 4606 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583950 4606 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583955 4606 flags.go:64] FLAG: --image-service-endpoint=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583960 4606 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583964 4606 flags.go:64] FLAG: --kube-api-burst="100"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583969 4606 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583976 4606 flags.go:64] FLAG: --kube-api-qps="50"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583981 4606 flags.go:64] FLAG: --kube-reserved=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583986 4606 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583990 4606 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.583996 4606 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584001 4606 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584007 4606 flags.go:64] FLAG: --lock-file=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584012 4606 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584017 4606 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584023 4606 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584032 4606 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584037 4606 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584042 4606 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584047 4606 flags.go:64] FLAG: --logging-format="text"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584051 4606 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584057 4606 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584063 4606 flags.go:64] FLAG: --manifest-url=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584068 4606 flags.go:64] FLAG: --manifest-url-header=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584074 4606 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584079 4606 flags.go:64] FLAG: --max-open-files="1000000"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584087 4606 flags.go:64] FLAG: --max-pods="110"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584092 4606 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584097 4606 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584103 4606 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584108 4606 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584113 4606 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584118 4606 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584124 4606 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584136 4606 flags.go:64] FLAG: --node-status-max-images="50"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584140 4606 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584145 4606 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584150 4606 flags.go:64] FLAG: --pod-cidr=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584155 4606 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584163 4606 flags.go:64] FLAG: --pod-manifest-path=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584190 4606 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584196 4606 flags.go:64] FLAG: --pods-per-core="0"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584202 4606 flags.go:64] FLAG: --port="10250"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584207 4606 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584212 4606 flags.go:64] FLAG: --provider-id=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584217 4606 flags.go:64] FLAG: --qos-reserved=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584222 4606 flags.go:64] FLAG: --read-only-port="10255"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584226 4606 flags.go:64] FLAG: --register-node="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584231 4606 flags.go:64] FLAG: --register-schedulable="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584237 4606 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584247 4606 flags.go:64] FLAG: --registry-burst="10"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584252 4606 flags.go:64] FLAG: --registry-qps="5"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584257 4606 flags.go:64] FLAG: --reserved-cpus=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584262 4606 flags.go:64] FLAG: --reserved-memory=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584269 4606 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584274 4606 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584279 4606 flags.go:64] FLAG: --rotate-certificates="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584284 4606 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584289 4606 flags.go:64] FLAG: --runonce="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584295 4606 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584300 4606 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584306 4606 flags.go:64] FLAG: --seccomp-default="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584311 4606 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584316 4606 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584321 4606 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584326 4606 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584331 4606 flags.go:64] FLAG: --storage-driver-password="root"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584336 4606 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584341 4606 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584346 4606 flags.go:64] FLAG: --storage-driver-user="root"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584350 4606 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584356 4606 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584361 4606 flags.go:64] FLAG: --system-cgroups=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584366 4606 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584375 4606 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584380 4606 flags.go:64] FLAG: --tls-cert-file=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584385 4606 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584391 4606 flags.go:64] FLAG: --tls-min-version=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584397 4606 flags.go:64] FLAG: --tls-private-key-file=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584402 4606 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584407 4606 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584412 4606 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584422 4606 flags.go:64] FLAG: --v="2"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584432 4606 flags.go:64] FLAG: --version="false"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584438 4606 flags.go:64] FLAG: --vmodule=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584444 4606 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584449 4606 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584568 4606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584575 4606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584580 4606 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584584 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584588 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584592 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584596 4606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584600 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584605 4606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584609 4606 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584613 4606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584617 4606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584620 4606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584624 4606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584627 4606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584631 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584634 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584638 4606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584641 4606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584645 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584648 4606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584652 4606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584656 4606 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584660 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584664 4606 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584667 4606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584670 4606 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584677 4606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584682 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584685 4606 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584689 4606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584692 4606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584696 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584699 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584703 4606 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584706 4606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584710 4606 feature_gate.go:330] unrecognized feature gate: Example
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584713 4606 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584717 4606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584721 4606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584725 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584728 4606 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584731 4606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584735 4606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584739 4606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584743 4606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584748 4606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584752 4606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584756 4606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584759 4606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584763 4606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584768 4606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584772 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584776 4606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584780 4606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584784 4606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584789 4606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584793 4606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584797 4606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584802 4606 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584808 4606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584811 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584815 4606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584819 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584822 4606 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584826 4606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584829 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584833 4606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584836 4606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584840 4606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.584843 4606 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.584856 4606 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 12 00:23:29
crc kubenswrapper[4606]: I1212 00:23:29.591466 4606 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.591497 4606 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591559 4606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591566 4606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591571 4606 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591575 4606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591579 4606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591582 4606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591586 4606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591589 4606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591593 4606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591598 4606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591603 4606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591608 4606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591613 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591618 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591623 4606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591628 4606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591632 4606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591636 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591641 4606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591645 4606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591650 4606 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591654 4606 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591658 4606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591661 4606 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 12 00:23:29 crc 
kubenswrapper[4606]: W1212 00:23:29.591665 4606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591668 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591672 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591676 4606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591679 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591683 4606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591687 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591691 4606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591696 4606 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591700 4606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591705 4606 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591710 4606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591715 4606 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591720 4606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591724 4606 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591729 4606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591733 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591737 4606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591742 4606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591746 4606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591751 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591755 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591759 4606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591763 4606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591768 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591772 4606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591776 4606 feature_gate.go:330] 
unrecognized feature gate: BareMetalLoadBalancer Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591781 4606 feature_gate.go:330] unrecognized feature gate: Example Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591785 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591789 4606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591794 4606 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591798 4606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591803 4606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591807 4606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591811 4606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591816 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591820 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591824 4606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591828 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591833 4606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591838 4606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591844 4606 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591849 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591854 4606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591858 4606 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591862 4606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.591874 4606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.591883 4606 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592010 4606 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592019 4606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592023 4606 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592027 4606 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 
00:23:29.592031 4606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592035 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592039 4606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592042 4606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592046 4606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592051 4606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592055 4606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592058 4606 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592062 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592066 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592070 4606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592076 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592080 4606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592085 4606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592088 4606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592092 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592095 4606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592100 4606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592104 4606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592108 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592114 4606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592117 4606 feature_gate.go:330] unrecognized feature gate: Example Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592121 4606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592124 4606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592128 4606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592132 4606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592136 4606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 12 00:23:29 
crc kubenswrapper[4606]: W1212 00:23:29.592140 4606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592143 4606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592146 4606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592152 4606 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592156 4606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592159 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592163 4606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592166 4606 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592189 4606 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592193 4606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592197 4606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592200 4606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592204 4606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592207 4606 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592211 4606 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592214 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592218 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592221 4606 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592225 4606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592228 4606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592232 4606 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592236 4606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592239 4606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592243 4606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592247 4606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592251 4606 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592254 4606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592258 4606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592261 4606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 12 00:23:29 crc 
kubenswrapper[4606]: W1212 00:23:29.592265 4606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592268 4606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592272 4606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592275 4606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592279 4606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592282 4606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592286 4606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592290 4606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592293 4606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592297 4606 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.592301 4606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.592307 4606 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.592649 4606 server.go:940] "Client rotation is on, will bootstrap in background" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.595221 4606 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.595293 4606 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.595681 4606 server.go:997] "Starting client certificate rotation" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.595698 4606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.595965 4606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-06 21:45:05.103154246 +0000 UTC Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.596059 4606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.599711 4606 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.601389 4606 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.603200 4606 dynamic_cafile_content.go:161] 
"Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.608927 4606 log.go:25] "Validated CRI v1 runtime API" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.619820 4606 log.go:25] "Validated CRI v1 image API" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.621133 4606 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.622696 4606 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-12-00-18-03-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.622728 4606 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.631824 4606 manager.go:217] Machine: {Timestamp:2025-12-12 00:23:29.630590301 +0000 UTC m=+0.175943187 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d5982033-b2dc-474c-9cda-275cd567c208 BootID:19ab28be-ccfb-4859-88d0-8d375e2d06f2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 
Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:82:d9:43 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:82:d9:43 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1e:e0:38 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:59:2d:50 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c9:bb:80 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c1:2f:5b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6e:00:53:21:0f:48 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:d4:d0:a7:e0:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 
Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.632191 4606 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.632373 4606 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.632878 4606 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633032 4606 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633065 4606 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633266 4606 topology_manager.go:138] "Creating topology manager with none policy"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633276 4606 container_manager_linux.go:303] "Creating device plugin manager"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633462 4606 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633493 4606 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633650 4606 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.633719 4606 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.634357 4606 kubelet.go:418] "Attempting to sync node with API server"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.634385 4606 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.634413 4606 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.634429 4606 kubelet.go:324] "Adding apiserver pod source"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.634441 4606 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.637540 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.637629 4606 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.637631 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.637760 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.637809 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.638017 4606 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.638598 4606 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639459 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639487 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639494 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639501 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639512 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639520 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639527 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639537 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639545 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639552 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639576 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639583 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.639857 4606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.640330 4606 server.go:1280] "Started kubelet"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.640653 4606 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.640957 4606 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.641543 4606 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.641628 4606 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.641909 4606 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.641932 4606 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.642083 4606 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 08:55:24.662534557 +0000 UTC
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.642111 4606 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 488h31m55.020425569s for next certificate rotation
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.642244 4606 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.642411 4606 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.642518 4606 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.643099 4606 server.go:460] "Adding debug handlers to kubelet server"
Dec 12 00:23:29 crc systemd[1]: Started Kubernetes Kubelet.
Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.646068 4606 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.646203 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms"
Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.646542 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.646610 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.647204 4606 factory.go:55] Registering systemd factory
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.647227 4606 factory.go:221] Registration of the systemd container factory successfully
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.648947 4606 factory.go:153] Registering CRI-O factory
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.649157 4606 factory.go:221] Registration of the crio container factory successfully
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.649559 4606 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.649410 4606 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18804ffec8ea186b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-12 00:23:29.640306795 +0000 UTC m=+0.185659662,LastTimestamp:2025-12-12 00:23:29.640306795 +0000 UTC m=+0.185659662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.649767 4606 factory.go:103] Registering Raw factory
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.650225 4606 manager.go:1196] Started watching for new ooms in manager
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.651450 4606 manager.go:319] Starting recovery of all containers
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.651524 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.651640 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.651652 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.651660 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652015 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652029 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652038 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652047 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652066 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652076 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652085 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652096 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652105 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652114 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652123 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652132 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652142 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652151 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652159 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652205 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652214 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652221 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652229 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652237 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652245 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652254 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652264 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652274 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652282 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652291 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652299 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652307 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652316 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652324 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652332 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652341 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652350 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652358 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652366 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652374 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652382 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652390 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652398 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652406 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652428 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652437 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652447 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652456 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652465 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652473 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652481 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652489 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652500 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652508 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652517 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652525 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652534 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652542 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652550 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652559 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652566 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652574 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652581 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.652589 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657622 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657651 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657689 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657706 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657725 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657740 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657752 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657768 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657781 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657800 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657815 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657827 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657844 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657856 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657873 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657885 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657898 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config"
seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657914 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657925 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657941 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657954 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657966 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.657983 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: 
I1212 00:23:29.657997 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658013 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658026 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658040 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658058 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658072 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658089 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658102 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658115 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658132 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658146 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658163 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658197 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658211 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658228 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658240 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658252 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658273 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658294 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658316 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658332 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658355 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658376 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658391 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658410 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658424 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658445 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658464 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658480 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658495 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658513 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658526 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658543 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658557 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658569 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658587 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658599 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.658615 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659408 4606 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659434 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659446 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659456 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659485 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659497 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659509 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659520 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659532 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659546 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659555 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659568 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659578 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659589 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659602 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659612 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659625 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659636 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659646 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659657 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659667 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659679 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659688 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659697 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659709 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659718 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659728 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659739 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659748 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659761 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659771 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659780 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659792 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659803 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659815 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659828 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659841 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659855 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659867 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659883 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659896 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" 
seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659908 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659923 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659935 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659950 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659964 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659976 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.659993 4606 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660005 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660021 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660033 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660046 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660063 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660076 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660092 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660104 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660121 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660138 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660152 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660169 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660204 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660221 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660239 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660253 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660269 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660282 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660294 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660311 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660334 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660352 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660365 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660380 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660397 4606 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660410 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660427 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660440 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660453 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660472 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660485 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660502 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660516 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660530 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660547 4606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660560 4606 reconstruct.go:97] "Volume reconstruction finished" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.660568 4606 reconciler.go:26] "Reconciler: start to sync state" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.685585 4606 manager.go:324] Recovery completed Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.695166 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 
00:23:29.696674 4606 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.698334 4606 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.698395 4606 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.698421 4606 kubelet.go:2335] "Starting kubelet main sync loop" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.698426 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.698454 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.698466 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.698463 4606 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.699153 4606 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.699194 4606 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.699213 4606 state_mem.go:36] "Initialized new in-memory state store" Dec 12 00:23:29 crc kubenswrapper[4606]: W1212 00:23:29.700821 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 
00:23:29.700928 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.707909 4606 policy_none.go:49] "None policy: Start" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.709836 4606 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.709865 4606 state_mem.go:35] "Initializing new in-memory state store" Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.746781 4606 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.765579 4606 manager.go:334] "Starting Device Plugin manager" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.765685 4606 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.765698 4606 server.go:79] "Starting device plugin registration server" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.766003 4606 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.766014 4606 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.767032 4606 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.767105 4606 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 
00:23:29.767114 4606 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.809756 4606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.809867 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.810868 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.810889 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.810899 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.811017 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.811416 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.811459 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812090 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812108 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812116 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812227 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812370 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812412 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812517 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812550 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.812561 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813050 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813066 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813073 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813143 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.813320 4606 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813377 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813416 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813724 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813738 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813741 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813760 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813800 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.813991 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.814089 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.814106 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.814114 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: 
I1212 00:23:29.814613 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.814664 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818090 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818113 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818122 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818097 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818210 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818223 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818274 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.818299 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.819564 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.819590 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.819600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.846831 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.862889 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.862920 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.862942 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.862962 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.862977 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.862992 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863007 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863023 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863038 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863052 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863065 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863079 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863093 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863107 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.863129 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.866527 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.867115 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.867143 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.867153 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.867196 4606 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 12 00:23:29 crc kubenswrapper[4606]: E1212 00:23:29.867459 4606 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Dec 
12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964209 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964259 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964279 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964297 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964313 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964328 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964343 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964359 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964374 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964387 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964404 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964442 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964483 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964510 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964522 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964500 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964507 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964552 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964593 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964562 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964655 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964662 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 
00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964636 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964630 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964726 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964736 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964749 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964779 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964793 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:29 crc kubenswrapper[4606]: I1212 00:23:29.964864 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.067984 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.069127 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.069197 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.069216 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.069249 4606 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 12 00:23:30 crc kubenswrapper[4606]: E1212 00:23:30.069969 4606 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.146430 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.169545 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 12 00:23:30 crc kubenswrapper[4606]: W1212 00:23:30.173579 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2506fc209e70a843b9fc135ef4f65e95d9d643762f597aef842ed99d04c36f12 WatchSource:0}: Error finding container 2506fc209e70a843b9fc135ef4f65e95d9d643762f597aef842ed99d04c36f12: Status 404 returned error can't find the container with id 2506fc209e70a843b9fc135ef4f65e95d9d643762f597aef842ed99d04c36f12 Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.176474 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:30 crc kubenswrapper[4606]: W1212 00:23:30.186919 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b119c2065071e3a408fbd7a4ddccf30bf1f54af8c59a52d33e431274caf6bfb1 WatchSource:0}: Error finding container b119c2065071e3a408fbd7a4ddccf30bf1f54af8c59a52d33e431274caf6bfb1: Status 404 returned error can't find the container with id b119c2065071e3a408fbd7a4ddccf30bf1f54af8c59a52d33e431274caf6bfb1 Dec 12 00:23:30 crc kubenswrapper[4606]: W1212 00:23:30.193156 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-530ec31987d6cfc87b4941550d1d677c782440b9ecc53fc72c7dac8efb2213ba WatchSource:0}: Error finding container 530ec31987d6cfc87b4941550d1d677c782440b9ecc53fc72c7dac8efb2213ba: Status 404 returned error can't find the container with id 530ec31987d6cfc87b4941550d1d677c782440b9ecc53fc72c7dac8efb2213ba Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.193962 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.200549 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:30 crc kubenswrapper[4606]: W1212 00:23:30.207002 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-44d74dae55fe707f967b8018a63ef88b3a3d4ac46fe9c0814d4fc0f67d75e9ee WatchSource:0}: Error finding container 44d74dae55fe707f967b8018a63ef88b3a3d4ac46fe9c0814d4fc0f67d75e9ee: Status 404 returned error can't find the container with id 44d74dae55fe707f967b8018a63ef88b3a3d4ac46fe9c0814d4fc0f67d75e9ee Dec 12 00:23:30 crc kubenswrapper[4606]: W1212 00:23:30.220448 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4bcbe5f83b6e8e9214f279be5d4194d36488ae66aa264577fe1ff02a6b733dd2 WatchSource:0}: Error finding container 4bcbe5f83b6e8e9214f279be5d4194d36488ae66aa264577fe1ff02a6b733dd2: Status 404 returned error can't find the container with id 4bcbe5f83b6e8e9214f279be5d4194d36488ae66aa264577fe1ff02a6b733dd2 Dec 12 00:23:30 crc kubenswrapper[4606]: E1212 00:23:30.248074 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.470135 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.475233 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.475294 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 
12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.475311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.475345 4606 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 12 00:23:30 crc kubenswrapper[4606]: E1212 00:23:30.475830 4606 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Dec 12 00:23:30 crc kubenswrapper[4606]: W1212 00:23:30.524487 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Dec 12 00:23:30 crc kubenswrapper[4606]: E1212 00:23:30.524571 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.642005 4606 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.704932 4606 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd" exitCode=0 Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.705057 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.705219 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2506fc209e70a843b9fc135ef4f65e95d9d643762f597aef842ed99d04c36f12"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.705340 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.706589 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.706635 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.706652 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.708666 4606 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c" exitCode=0 Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.708777 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.708814 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4bcbe5f83b6e8e9214f279be5d4194d36488ae66aa264577fe1ff02a6b733dd2"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.708914 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.710013 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.710048 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.710065 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.710509 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.710555 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"44d74dae55fe707f967b8018a63ef88b3a3d4ac46fe9c0814d4fc0f67d75e9ee"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.713009 4606 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13" exitCode=0 Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.713095 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.713138 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"530ec31987d6cfc87b4941550d1d677c782440b9ecc53fc72c7dac8efb2213ba"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.713323 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.714381 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.714428 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.714446 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.715570 4606 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc" exitCode=0 Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.715609 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc"} Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.715637 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b119c2065071e3a408fbd7a4ddccf30bf1f54af8c59a52d33e431274caf6bfb1"} Dec 12 00:23:30 
crc kubenswrapper[4606]: I1212 00:23:30.715761 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.715914 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.717000 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.717011 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.717038 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.717052 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.717022 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:30 crc kubenswrapper[4606]: I1212 00:23:30.717139 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:31 crc kubenswrapper[4606]: E1212 00:23:31.050507 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Dec 12 00:23:31 crc kubenswrapper[4606]: W1212 00:23:31.078734 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Dec 
12 00:23:31 crc kubenswrapper[4606]: E1212 00:23:31.078802 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Dec 12 00:23:31 crc kubenswrapper[4606]: W1212 00:23:31.161685 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Dec 12 00:23:31 crc kubenswrapper[4606]: E1212 00:23:31.161766 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Dec 12 00:23:31 crc kubenswrapper[4606]: W1212 00:23:31.250783 4606 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Dec 12 00:23:31 crc kubenswrapper[4606]: E1212 00:23:31.250855 4606 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.276118 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.278683 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.278715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.278725 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.278748 4606 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 12 00:23:31 crc kubenswrapper[4606]: E1212 00:23:31.279099 4606 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.719249 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.719297 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.719311 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.719329 
4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.720309 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.720331 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.720339 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.721232 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.721255 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.721267 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.722262 4606 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d" exitCode=0 Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.722303 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.722412 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.723243 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.723265 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.723274 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.725247 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c06b3470d2a194c99cd63ef6039a6f2abff89fc93e8f61feca67c4eefa9b9ac8"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.725320 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.730322 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.730338 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.730346 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.731829 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.731847 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.731866 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450"} Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.731918 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.732426 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.732488 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.732552 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.782153 4606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 12 00:23:31 crc kubenswrapper[4606]: I1212 00:23:31.897497 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.738805 4606 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4"} Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.738842 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66"} Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.738998 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.740389 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.740582 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.740590 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.742479 4606 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e" exitCode=0 Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.742643 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.742841 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e"} Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.743030 
4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.744107 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.744149 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.744202 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.745432 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.745622 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.745795 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.879229 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.880366 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.880398 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.880409 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 00:23:32.880432 4606 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 12 00:23:32 crc kubenswrapper[4606]: I1212 
00:23:32.930902 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.392976 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750455 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6"} Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750497 4606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750525 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2"} Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750559 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe"} Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750584 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750583 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6"} Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750762 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852"} Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750594 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.750537 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752135 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752216 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752232 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752214 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752353 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752376 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752909 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752966 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:33 crc kubenswrapper[4606]: I1212 00:23:33.752989 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.444531 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.615246 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.638322 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.753167 4606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.753276 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.753301 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.753277 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.754901 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.754952 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.754983 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.755003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.755044 4606 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.755067 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.755375 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.755415 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.755437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.841069 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.898318 4606 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 12 00:23:34 crc kubenswrapper[4606]: I1212 00:23:34.898430 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.755598 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 
00:23:35.756233 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.756682 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.757508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.757537 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.757548 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.758264 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.758292 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.758306 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.758887 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.758911 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:35 crc kubenswrapper[4606]: I1212 00:23:35.758922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:36 crc kubenswrapper[4606]: I1212 00:23:36.757583 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 
12 00:23:36 crc kubenswrapper[4606]: I1212 00:23:36.758573 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:36 crc kubenswrapper[4606]: I1212 00:23:36.758622 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:36 crc kubenswrapper[4606]: I1212 00:23:36.758635 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:37 crc kubenswrapper[4606]: I1212 00:23:37.667011 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:23:37 crc kubenswrapper[4606]: I1212 00:23:37.667292 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:37 crc kubenswrapper[4606]: I1212 00:23:37.669377 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:37 crc kubenswrapper[4606]: I1212 00:23:37.669424 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:37 crc kubenswrapper[4606]: I1212 00:23:37.669437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:39 crc kubenswrapper[4606]: E1212 00:23:39.814427 4606 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.643223 4606 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.691131 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.691361 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.692727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.692782 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.692810 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.702384 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.770969 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.772497 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.772558 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.772583 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:41 crc kubenswrapper[4606]: I1212 00:23:41.778549 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:41 crc kubenswrapper[4606]: E1212 00:23:41.783728 4606 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 12 00:23:42 crc kubenswrapper[4606]: E1212 00:23:42.651919 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.684535 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.684746 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.685955 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.685991 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.686003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.772724 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.773654 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.773871 4606 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.773987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.885607 4606 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.885665 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.894896 4606 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 12 00:23:42 crc kubenswrapper[4606]: I1212 00:23:42.895099 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.621363 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:44 crc 
kubenswrapper[4606]: I1212 00:23:44.621563 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.623510 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.623727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.624011 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.628943 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.776442 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.777727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.777782 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.777802 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 00:23:44.898408 4606 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 12 00:23:44 crc kubenswrapper[4606]: I1212 
00:23:44.898509 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 12 00:23:46 crc kubenswrapper[4606]: I1212 00:23:46.065812 4606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 12 00:23:46 crc kubenswrapper[4606]: I1212 00:23:46.081156 4606 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 12 00:23:46 crc kubenswrapper[4606]: I1212 00:23:46.854363 4606 csr.go:261] certificate signing request csr-b8nj2 is approved, waiting to be issued Dec 12 00:23:46 crc kubenswrapper[4606]: I1212 00:23:46.869647 4606 csr.go:257] certificate signing request csr-b8nj2 is issued Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.870632 4606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-12 00:18:46 +0000 UTC, rotation deadline is 2026-08-26 07:09:47.808355815 +0000 UTC Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.870682 4606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6174h45m59.937675876s for next certificate rotation Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.884520 4606 trace.go:236] Trace[1536946120]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Dec-2025 00:23:33.302) (total time: 14581ms): Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[1536946120]: ---"Objects listed" error: 14581ms (00:23:47.884) Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[1536946120]: [14.581961197s] [14.581961197s] END Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.884549 4606 
reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.886143 4606 trace.go:236] Trace[1946150119]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Dec-2025 00:23:33.954) (total time: 13931ms): Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[1946150119]: ---"Objects listed" error: 13931ms (00:23:47.886) Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[1946150119]: [13.931160619s] [13.931160619s] END Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.886158 4606 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.887748 4606 trace.go:236] Trace[644891276]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Dec-2025 00:23:33.734) (total time: 14153ms): Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[644891276]: ---"Objects listed" error: 14153ms (00:23:47.887) Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[644891276]: [14.153689746s] [14.153689746s] END Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.887765 4606 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.888655 4606 trace.go:236] Trace[659247911]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Dec-2025 00:23:33.524) (total time: 14363ms): Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[659247911]: ---"Objects listed" error: 14363ms (00:23:47.888) Dec 12 00:23:47 crc kubenswrapper[4606]: Trace[659247911]: [14.363959616s] [14.363959616s] END Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.888679 4606 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 12 00:23:47 crc kubenswrapper[4606]: I1212 00:23:47.891660 4606 reconstruct.go:205] "DevicePaths of reconstructed 
volumes updated" Dec 12 00:23:47 crc kubenswrapper[4606]: E1212 00:23:47.898539 4606 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.128744 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.145088 4606 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.145564 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.145980 4606 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56848->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.146075 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56848->192.168.126.11:17697: read: connection reset by peer" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.150669 4606 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56856->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.151055 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56856->192.168.126.11:17697: read: connection reset by peer" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.673385 4606 apiserver.go:52] "Watching apiserver" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.765406 4606 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.765872 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-554rp","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.766231 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.766415 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.766511 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.766498 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.766476 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.766658 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.766782 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-554rp" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.767098 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.767470 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.768200 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.768919 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.769139 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.769155 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.769678 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.771987 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.777625 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.777872 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.777989 4606 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.778123 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.778242 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.778372 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.778545 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.788382 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.790537 4606 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4" exitCode=255 Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.790573 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4"} Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.790995 4606 scope.go:117] "RemoveContainer" containerID="c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.811311 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:3
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.843919 4606 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.864237 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.883821 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897498 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897556 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897620 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897648 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897672 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897700 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897725 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897752 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897776 4606 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897795 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897815 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897840 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897860 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897886 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") 
pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897906 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897928 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897950 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.897976 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898215 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898238 4606 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898260 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898281 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898304 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898357 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898382 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898402 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898441 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898464 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898484 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898505 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898526 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898547 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898569 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898592 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898614 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898636 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898655 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898674 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898699 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898723 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898745 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898767 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898773 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898788 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898876 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898908 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898957 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.898986 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899012 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899036 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899060 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899086 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899111 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899124 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899134 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899204 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899304 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899347 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899403 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899423 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899441 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899457 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899475 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899494 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899527 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899548 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899598 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899619 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899644 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899669 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899693 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899717 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899739 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899760 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899784 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899805 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899825 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899840 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899855 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899875 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899896 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899919 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899940 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899963 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899985 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900005 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900025 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900047 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900069 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900091 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900112 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900133 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900156 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900193 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900218 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900241 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900264 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900285 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900308 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900331 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900359 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900379 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900401 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900422 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900463 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900487 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900512 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900539 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900561 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900582 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900606 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900629 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900652 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900679 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900702 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900723 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900746 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900769 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900793 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900818 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900841 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900864 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900888 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900910 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900932 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900953 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900977 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901002 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901024 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901046 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901068 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901090 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901113 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901135 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901161 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901224 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901250 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901272 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901294 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901316 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901340 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901365 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901393 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901418 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901441 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901463 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901487 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901509 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901531 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901553 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901598 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 
00:23:48.901624 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901646 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901668 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901692 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901715 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901738 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" 
(UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901761 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901785 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901809 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901833 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901856 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901878 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901903 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901928 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901953 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901979 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902005 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902030 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902267 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902293 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902317 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902342 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902365 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" 
(UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902390 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902415 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902443 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902468 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902493 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902519 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902543 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902568 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902594 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902619 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902642 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902685 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902709 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902731 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902753 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902779 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 
00:23:48.902802 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902834 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902861 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902886 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902910 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902934 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902958 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902984 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903035 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903064 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903263 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903293 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1d34036b-3243-416a-a0c0-1d1ddb3a0ca5-hosts-file\") pod \"node-resolver-554rp\" (UID: \"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\") " pod="openshift-dns/node-resolver-554rp" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903323 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903350 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903378 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9k8v\" (UniqueName: \"kubernetes.io/projected/1d34036b-3243-416a-a0c0-1d1ddb3a0ca5-kube-api-access-g9k8v\") pod \"node-resolver-554rp\" (UID: \"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\") " pod="openshift-dns/node-resolver-554rp" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903402 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903429 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903454 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903481 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903506 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903530 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903557 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903584 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903611 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903707 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903725 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.903739 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.908267 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899643 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.899808 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900223 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900389 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.900552 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.901124 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902435 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902643 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.902826 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.904011 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.906607 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.906852 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.908532 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.908777 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.908935 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.909104 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.910026 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.910102 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.910649 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.911193 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.911484 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.913464 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.913749 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.914021 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.914308 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.915180 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.915340 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:23:49.415295288 +0000 UTC m=+19.960648164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.918058 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.918643 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.919043 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.919098 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.919449 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.919467 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.919748 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.919973 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920163 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920406 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920478 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920556 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920660 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920726 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920843 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.920869 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921020 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921023 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921148 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921244 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921398 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921565 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921767 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.922735 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921972 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.922094 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.922203 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.922360 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.922504 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.922658 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.923027 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.923168 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.923402 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.923635 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.923660 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.923859 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.923870 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.924051 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.924093 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.924253 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.924677 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.925049 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.925417 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.925711 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.925726 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.925976 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.928540 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.928655 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.928706 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.929012 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.929185 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.929431 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.929701 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.930374 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.931113 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.931465 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.931516 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.931925 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.932017 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.944088 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.944443 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.944687 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.945469 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.946044 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.946384 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.946580 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.946828 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.946964 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947035 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947042 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947103 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947199 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947349 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947507 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947558 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947571 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947711 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.947858 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.948256 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.948341 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.948452 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.948802 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.948827 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.949636 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.954359 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.954482 4606 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.954528 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:49.454514501 +0000 UTC m=+19.999867367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.955862 4606 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.958291 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.958384 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:49.458361475 +0000 UTC m=+20.003714341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.961505 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.921924 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.969353 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.974590 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.974905 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.975867 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.976290 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.976777 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.976921 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.976998 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.977046 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.977114 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.977288 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.977370 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.977844 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.978227 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.978353 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.978467 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.978578 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.978690 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.978796 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.978969 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.979082 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.979432 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.979524 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.979612 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.980408 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.983658 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984335 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984389 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984428 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984563 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984607 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984649 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984781 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.984833 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.985023 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.985043 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.985054 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.985099 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:49.485085262 +0000 UTC m=+20.030438128 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.985378 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.986073 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.986319 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.990496 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.990522 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.990532 4606 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:48 crc kubenswrapper[4606]: E1212 00:23:48.990577 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:49.490561689 +0000 UTC m=+20.035914555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.993761 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.994134 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.994273 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.994426 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.994473 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.995528 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.997377 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:48 crc kubenswrapper[4606]: I1212 00:23:48.999664 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.001072 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.001581 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.001881 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.002239 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.002295 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.002376 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.002456 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.002705 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.005276 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.005680 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.005838 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1d34036b-3243-416a-a0c0-1d1ddb3a0ca5-hosts-file\") pod \"node-resolver-554rp\" (UID: \"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\") " pod="openshift-dns/node-resolver-554rp" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.005885 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9k8v\" (UniqueName: \"kubernetes.io/projected/1d34036b-3243-416a-a0c0-1d1ddb3a0ca5-kube-api-access-g9k8v\") pod \"node-resolver-554rp\" (UID: \"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\") " pod="openshift-dns/node-resolver-554rp" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.005910 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.005940 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006042 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006062 4606 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006076 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006088 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006100 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006113 4606 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006124 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006137 4606 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006149 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.006160 4606 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009254 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009274 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009341 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009359 4606 reconciler_common.go:293] "Volume detached for 
volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009375 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009388 4606 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009402 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009418 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009432 4606 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009445 4606 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009458 4606 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009470 4606 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009483 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009498 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009510 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009524 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009536 4606 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009550 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") 
on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009563 4606 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009575 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009588 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009601 4606 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009613 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009624 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009679 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 
00:23:49.009692 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009707 4606 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009719 4606 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009730 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009743 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009757 4606 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009769 4606 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009780 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009787 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009791 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009823 4606 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009837 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009849 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009849 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009862 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009920 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009930 4606 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009941 4606 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009956 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009966 4606 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009976 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009985 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009994 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010002 4606 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010010 4606 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010019 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010027 4606 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010037 4606 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010047 4606 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010056 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010065 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010074 4606 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010083 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010092 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010100 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010109 4606 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010117 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010126 4606 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010136 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010146 4606 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010154 4606 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010163 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010195 4606 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010206 4606 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010214 4606 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010222 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010231 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010240 4606 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010250 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010258 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010266 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010274 4606 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010283 4606 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010292 4606 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010300 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010309 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010318 4606 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010325 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010333 4606 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010343 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010352 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010360 4606 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010368 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010377 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010386 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010395 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010404 4606 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010412 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010421 4606 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010429 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010437 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010446 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010453 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010461 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010470 4606 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010478 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010486 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010495 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010504 4606 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010512 4606 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010520 4606 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010528 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010536 4606 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010534 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010545 4606 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010645 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010660 4606 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010674 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010687 4606 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010700 4606 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010711 4606 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.010721 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011228 4606 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011249 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011262 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011299 4606 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011314 4606 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011326 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011338 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011350 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.009689 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1d34036b-3243-416a-a0c0-1d1ddb3a0ca5-hosts-file\") pod \"node-resolver-554rp\" (UID: \"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\") " pod="openshift-dns/node-resolver-554rp"
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011388 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011412 4606 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011425 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011436 4606 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011445 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011453 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011462 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011471 4606 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011479 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011489 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011498 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011506 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011514 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011523 4606 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011533 4606 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011540 4606 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011549 4606 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011559 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011567 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011575 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011583 4606 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011591 4606 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011600 4606 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011607 4606 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011615 4606 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011624 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011632 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011640 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011648 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011657 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011665 4606 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011673 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.011885 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.012365 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.013372 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.022617 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.022921 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.023310 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.023481 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.023621 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.024061 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.024298 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.024386 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.024449 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.024856 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.027397 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.027546 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.027672 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.027725 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.027941 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.028052 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.029375 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.029620 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.030185 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.030252 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.031391 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.031420 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.032986 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.037418 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.039999 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.040350 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.040519 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.040714 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.045566 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.047036 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9k8v\" (UniqueName: \"kubernetes.io/projected/1d34036b-3243-416a-a0c0-1d1ddb3a0ca5-kube-api-access-g9k8v\") pod \"node-resolver-554rp\" (UID: \"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\") " pod="openshift-dns/node-resolver-554rp" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.062960 4606 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.074613 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.090994 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.097761 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 
00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.110671 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112569 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112597 4606 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112610 4606 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112620 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112632 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc 
kubenswrapper[4606]: I1212 00:23:49.112642 4606 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112652 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112661 4606 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112671 4606 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112681 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112692 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112705 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112716 4606 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112728 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112740 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112752 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112763 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112777 4606 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112789 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112801 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112813 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112826 4606 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112837 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112851 4606 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112862 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112873 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.112883 4606 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc 
kubenswrapper[4606]: I1212 00:23:49.112894 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.123648 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.130526 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-554rp" Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.141676 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f6a6fa00c2069f0bf22478b89a8cc79d482e1717293dbd1e24631823a796ba57 WatchSource:0}: Error finding container f6a6fa00c2069f0bf22478b89a8cc79d482e1717293dbd1e24631823a796ba57: Status 404 returned error can't find the container with id f6a6fa00c2069f0bf22478b89a8cc79d482e1717293dbd1e24631823a796ba57 Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.420529 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.420712 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:23:50.420686373 +0000 UTC m=+20.966039249 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.521011 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.521067 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.521101 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.521139 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521280 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521301 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521313 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521362 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:50.521346056 +0000 UTC m=+21.066698922 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521717 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521743 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521754 4606 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521782 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:50.521772548 +0000 UTC m=+21.067125414 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521836 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521868 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:50.52185883 +0000 UTC m=+21.067211696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521903 4606 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: E1212 00:23:49.521930 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-12 00:23:50.521921812 +0000 UTC m=+21.067274668 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.596259 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cqmz5"] Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.596592 4606 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.596763 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.597077 4606 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.597117 4606 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.597701 4606 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch 
lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.597743 4606 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598143 4606 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598197 4606 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598223 4606 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598246 4606 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598271 4606 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than 
a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598292 4606 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598494 4606 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: W1212 00:23:49.598525 4606 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.606048 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.607673 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.607855 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.607921 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.610748 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 12 00:23:49 crc 
kubenswrapper[4606]: I1212 00:23:49.611028 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xzcfk"] Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.611244 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-w4rbn"] Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.611733 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.611977 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.616801 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.616987 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.617125 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.617275 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.617397 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.617530 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.617661 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.627215 4606 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches 
to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.661156 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.692663 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.703227 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.703710 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.704424 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.704962 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.705494 
4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.705933 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.706532 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.707052 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.707625 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.708095 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.708598 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.711027 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.711604 
4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.712197 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.714753 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.715274 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.716236 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.716596 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.717126 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.718123 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.718551 
4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.719461 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.719869 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.720849 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.721269 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.721816 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.722860 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.723416 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.724388 
4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.724849 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728239 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-system-cni-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728281 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-hostroot\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728302 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-conf-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728336 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986w2\" (UniqueName: \"kubernetes.io/projected/a543e227-be89-40cb-941d-b4707cc28921-kube-api-access-986w2\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728360 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spsv9\" (UniqueName: \"kubernetes.io/projected/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-kube-api-access-spsv9\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728381 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-netns\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728439 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-os-release\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728464 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728438 4606 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728530 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728555 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a543e227-be89-40cb-941d-b4707cc28921-proxy-tls\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728570 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-cni-bin\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728588 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-multus-certs\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728601 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-cni-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728617 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-etc-kubernetes\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728636 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-daemon-config\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728651 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-k8s-cni-cncf-io\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728665 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-kubelet\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728666 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728680 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cnibin\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728781 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a543e227-be89-40cb-941d-b4707cc28921-mcd-auth-proxy-config\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728802 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-cni-binary-copy\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728816 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-socket-dir-parent\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728828 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-tuning-conf-dir\") 
pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728844 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-cnibin\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728858 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-cni-multus\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728872 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tv2\" (UniqueName: \"kubernetes.io/projected/470c2076-46bf-4305-9fb1-3e509eb4d4f0-kube-api-access-k7tv2\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728886 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-os-release\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728899 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a543e227-be89-40cb-941d-b4707cc28921-rootfs\") 
pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.728912 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-system-cni-dir\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.729317 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.730282 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.730834 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.731319 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.734043 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.735394 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.736036 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.740289 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.741248 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.741781 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.743042 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.744208 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.744912 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.745790 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.746427 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.747474 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.748455 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.749447 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.750113 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.750721 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.751876 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.752758 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.753869 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.793348 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-554rp" event={"ID":"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5","Type":"ContainerStarted","Data":"346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.793396 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-554rp" event={"ID":"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5","Type":"ContainerStarted","Data":"3669b627090eb1f1d3d9b91777b6f0255f62be25b9c7ee20f3ad550eb45ec883"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.794854 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f6a6fa00c2069f0bf22478b89a8cc79d482e1717293dbd1e24631823a796ba57"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.796031 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.796061 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"851a83473c28345a8fa1e41fe5202c7c2356c87168d7941b997ee7a34abcc68a"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.797715 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.798278 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.799948 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.800255 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.801317 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.801368 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.801387 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1b324d05bb8fe1943472299fd2564c398c3732e7b71fc3da51ff91f32552cf25"} Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 
00:23:49.817812 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829243 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829278 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829298 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a543e227-be89-40cb-941d-b4707cc28921-proxy-tls\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829316 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-cni-bin\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829333 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-multus-certs\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829348 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-cni-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829363 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-daemon-config\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829377 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-etc-kubernetes\") pod \"multus-xzcfk\" 
(UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829398 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a543e227-be89-40cb-941d-b4707cc28921-mcd-auth-proxy-config\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829415 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-k8s-cni-cncf-io\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829433 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-kubelet\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829448 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cnibin\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829462 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-cni-binary-copy\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " 
pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829476 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-socket-dir-parent\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829490 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829505 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-cnibin\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829518 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-os-release\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829531 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-cni-multus\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829546 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tv2\" (UniqueName: \"kubernetes.io/projected/470c2076-46bf-4305-9fb1-3e509eb4d4f0-kube-api-access-k7tv2\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829563 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a543e227-be89-40cb-941d-b4707cc28921-rootfs\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829584 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-system-cni-dir\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829600 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-hostroot\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829613 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-conf-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829631 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-system-cni-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829657 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986w2\" (UniqueName: \"kubernetes.io/projected/a543e227-be89-40cb-941d-b4707cc28921-kube-api-access-986w2\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829680 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spsv9\" (UniqueName: \"kubernetes.io/projected/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-kube-api-access-spsv9\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829722 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-netns\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829740 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-os-release\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.829996 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-os-release\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830417 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830596 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830614 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-system-cni-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830844 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-netns\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830881 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-cni-multus\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830914 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-cnibin\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830947 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-os-release\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.830952 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-hostroot\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.831206 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-conf-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.831246 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a543e227-be89-40cb-941d-b4707cc28921-rootfs\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc 
kubenswrapper[4606]: I1212 00:23:49.831282 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-etc-kubernetes\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.831298 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-multus-certs\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.831334 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-cni-dir\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.831975 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-daemon-config\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.832033 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-cni-bin\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.832068 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-var-lib-kubelet\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.832637 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a543e227-be89-40cb-941d-b4707cc28921-mcd-auth-proxy-config\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.832680 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-host-run-k8s-cni-cncf-io\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.833041 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-cni-binary-copy\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.833076 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-cnibin\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.833108 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-multus-socket-dir-parent\") 
pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.833197 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.833256 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/470c2076-46bf-4305-9fb1-3e509eb4d4f0-system-cni-dir\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.835773 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a543e227-be89-40cb-941d-b4707cc28921-proxy-tls\") pod \"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.878055 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tv2\" (UniqueName: \"kubernetes.io/projected/470c2076-46bf-4305-9fb1-3e509eb4d4f0-kube-api-access-k7tv2\") pod \"multus-additional-cni-plugins-w4rbn\" (UID: \"470c2076-46bf-4305-9fb1-3e509eb4d4f0\") " pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.882876 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986w2\" (UniqueName: \"kubernetes.io/projected/a543e227-be89-40cb-941d-b4707cc28921-kube-api-access-986w2\") pod 
\"machine-config-daemon-cqmz5\" (UID: \"a543e227-be89-40cb-941d-b4707cc28921\") " pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.883455 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.883717 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spsv9\" (UniqueName: \"kubernetes.io/projected/b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0-kube-api-access-spsv9\") pod \"multus-xzcfk\" (UID: \"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\") " pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.901003 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.910358 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.927587 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.928470 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.936670 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xzcfk" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.947496 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.962758 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:49 crc kubenswrapper[4606]: I1212 00:23:49.987373 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:49.995841 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hpw5w"] Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:49.996581 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.000545 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.000795 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.000905 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.001005 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.001091 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.001304 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.001377 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.018805 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a76
7ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.039988 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.060457 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.075551 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.104894 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.124060 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133137 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-ovn-kubernetes\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133284 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133329 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-etc-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133353 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-systemd-units\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133374 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-ovn\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133404 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-script-lib\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133447 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-bin\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133471 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovn-node-metrics-cert\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133498 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-slash\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133529 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-env-overrides\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133551 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcl8\" (UniqueName: \"kubernetes.io/projected/da25b0ba-5398-4185-a4c1-aeba44ae5633-kube-api-access-hbcl8\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133583 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-var-lib-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133619 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-kubelet\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133647 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-netd\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133674 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-systemd\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133701 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-log-socket\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133723 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-node-log\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133746 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-config\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133788 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-netns\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.133813 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.146668 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.164058 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.179271 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 
00:23:50.193626 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.208703 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.221769 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.232286 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.234593 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.234745 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-etc-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.234805 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-etc-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.234813 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-systemd-units\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.234801 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.234968 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-systemd-units\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235065 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-ovn\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235099 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-script-lib\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235124 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-bin\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 
12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235140 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovn-node-metrics-cert\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235160 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-slash\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235159 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-ovn\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235195 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-env-overrides\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235269 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcl8\" (UniqueName: \"kubernetes.io/projected/da25b0ba-5398-4185-a4c1-aeba44ae5633-kube-api-access-hbcl8\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235309 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-kubelet\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235343 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-var-lib-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235381 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-netd\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235413 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-systemd\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235453 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-node-log\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235482 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-log-socket\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235512 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-config\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235545 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-netns\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235600 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235616 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-bin\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235638 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-ovn-kubernetes\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235728 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-ovn-kubernetes\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.235890 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-slash\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236006 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-node-log\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236061 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-kubelet\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236106 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-var-lib-openvswitch\") pod 
\"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236159 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-netd\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236294 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-netns\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236363 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-openvswitch\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236417 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-log-socket\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.236474 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-systemd\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 
00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.244606 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-env-overrides\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.248673 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-config\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.249673 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-script-lib\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.252727 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\
"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.255539 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovn-node-metrics-cert\") pod \"ovnkube-node-hpw5w\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.256163 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbcl8\" (UniqueName: \"kubernetes.io/projected/da25b0ba-5398-4185-a4c1-aeba44ae5633-kube-api-access-hbcl8\") pod \"ovnkube-node-hpw5w\" (UID: 
\"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.274426 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.288811 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.301526 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.312487 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.324720 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.326763 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.336551 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: W1212 00:23:50.362872 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda25b0ba_5398_4185_a4c1_aeba44ae5633.slice/crio-5c2a63eacc2ccc10dc4ae350eb8ea92415a431f080396e6b22a1b790b6667255 WatchSource:0}: Error finding container 5c2a63eacc2ccc10dc4ae350eb8ea92415a431f080396e6b22a1b790b6667255: Status 404 returned error can't find the container with id 5c2a63eacc2ccc10dc4ae350eb8ea92415a431f080396e6b22a1b790b6667255 Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.363684 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a76
7ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.389295 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.408468 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.428837 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.437240 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.437607 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:23:52.437587727 +0000 UTC m=+22.982940593 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.452764 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.472686 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.493301 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.505069 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.515133 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.523702 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.524418 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.537583 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.538027 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.538069 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.538095 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.538121 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538259 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538267 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538299 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538307 4606 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538275 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538323 4606 projected.go:194] Error 
preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538349 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:52.538334852 +0000 UTC m=+23.083687728 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538371 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:52.538356182 +0000 UTC m=+23.083709048 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538312 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538402 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:52.538394414 +0000 UTC m=+23.083747400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538461 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.538489 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:52.538481346 +0000 UTC m=+23.083834302 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.559066 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.580489 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.598693 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.699608 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.699819 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.700093 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.700124 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.700167 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:23:50 crc kubenswrapper[4606]: E1212 00:23:50.700224 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.766303 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.780559 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.794304 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.813811 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2" exitCode=0 Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.813909 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.813971 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" 
event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"5c2a63eacc2ccc10dc4ae350eb8ea92415a431f080396e6b22a1b790b6667255"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.816134 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerStarted","Data":"84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.816202 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerStarted","Data":"4fe1a10b233a7e6dc14ca1d8c6de6af51e142974d7226e09d522b0f71c5ade42"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.823754 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.823951 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.824230 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"ffacdc3d3d02d63324e514cbc4d9854947e79f53d3a1dcc07dbb4ac6bed3502f"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.825971 4606 generic.go:334] "Generic (PLEG): container finished" podID="470c2076-46bf-4305-9fb1-3e509eb4d4f0" 
containerID="34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d" exitCode=0 Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.826251 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerDied","Data":"34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.826349 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerStarted","Data":"bec04773934d8a9973f2e36c2151225e2476b19f4dea473a85f731102fde3176"} Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.846509 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.852460 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.894775 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.913439 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.936122 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.952903 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:50 crc kubenswrapper[4606]: I1212 00:23:50.987366 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:50Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.006325 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.023478 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.075705 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.095000 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.098969 4606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.100569 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.100605 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.100618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.100726 4606 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.116882 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a76
7ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.122023 4606 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.122299 4606 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.123717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.123738 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc 
kubenswrapper[4606]: I1212 00:23:51.123746 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.123760 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.123768 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.131240 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.138568 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.143531 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.156313 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 12 00:23:51 crc 
kubenswrapper[4606]: E1212 00:23:51.160847 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.165852 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.168242 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.168276 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.168284 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.168297 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.168308 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.172786 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: E1212 00:23:51.184515 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.185153 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.188525 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.188624 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc 
kubenswrapper[4606]: I1212 00:23:51.188705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.188772 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.188835 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.225703 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: E1212 00:23:51.225774 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.237503 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.237549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.237560 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.237579 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.237588 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.254338 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: E1212 00:23:51.258262 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.261985 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.262006 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.262049 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.262075 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.262085 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.271994 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: E1212 00:23:51.283757 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: E1212 00:23:51.283907 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.284998 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.285722 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.285743 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.285752 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.285767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.285777 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.309020 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.336594 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.354933 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.376986 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836
f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.387981 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.388031 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.388039 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc 
kubenswrapper[4606]: I1212 00:23:51.388052 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.388061 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.392195 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.407247 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.421653 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.490789 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.490827 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.490840 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 
00:23:51.490855 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.490866 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.593334 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.593598 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.593607 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.593621 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.593631 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.696209 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.696382 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.696457 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.696545 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.696631 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.799212 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.799241 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.799249 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.799262 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.799273 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.835903 4606 generic.go:334] "Generic (PLEG): container finished" podID="470c2076-46bf-4305-9fb1-3e509eb4d4f0" containerID="6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1" exitCode=0 Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.835999 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerDied","Data":"6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.840295 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.848721 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.848765 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.848775 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.848785 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.852836 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.865515 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.877305 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836
f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.889636 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.904967 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:51 crc 
kubenswrapper[4606]: I1212 00:23:51.905266 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.905280 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.905298 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.905309 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:51Z","lastTransitionTime":"2025-12-12T00:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.907452 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.912251 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.912508 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.922215 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.939947 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.966129 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.972847 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qtz7b"] Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 
00:23:51.973197 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.978016 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.978139 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.978367 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.979301 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.982581 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:51 crc kubenswrapper[4606]: I1212 00:23:51.998642 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:51Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.013665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.013695 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.013708 4606 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.013725 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.013736 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.020413 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.033668 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.058904 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.081625 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.093220 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14
548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.107564 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.116341 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.116381 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.116390 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.116404 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.116414 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.121273 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.133757 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.144377 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.155110 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.161333 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f437a237-3eb8-4817-b0a9-35efece69933-host\") pod \"node-ca-qtz7b\" 
(UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.161520 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f437a237-3eb8-4817-b0a9-35efece69933-serviceca\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.161615 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq49f\" (UniqueName: \"kubernetes.io/projected/f437a237-3eb8-4817-b0a9-35efece69933-kube-api-access-qq49f\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.168335 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.179823 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.195976 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.215264 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.219346 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.219375 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.219386 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.219400 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.219411 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.231001 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.260626 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.262245 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/f437a237-3eb8-4817-b0a9-35efece69933-serviceca\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.262283 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq49f\" (UniqueName: \"kubernetes.io/projected/f437a237-3eb8-4817-b0a9-35efece69933-kube-api-access-qq49f\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.262311 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f437a237-3eb8-4817-b0a9-35efece69933-host\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.262368 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f437a237-3eb8-4817-b0a9-35efece69933-host\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.263565 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f437a237-3eb8-4817-b0a9-35efece69933-serviceca\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.285720 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.301760 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq49f\" (UniqueName: \"kubernetes.io/projected/f437a237-3eb8-4817-b0a9-35efece69933-kube-api-access-qq49f\") pod \"node-ca-qtz7b\" (UID: \"f437a237-3eb8-4817-b0a9-35efece69933\") " pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.321772 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.321802 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.321813 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.321828 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.321839 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.424229 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.424256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.424264 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.424276 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.424284 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.464095 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.464368 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:23:56.464348115 +0000 UTC m=+27.009700981 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.526674 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.526714 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.526724 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.526738 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: 
I1212 00:23:52.526747 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.565569 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.565625 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.565662 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.565695 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565767 4606 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565775 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565799 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565813 4606 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565821 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:56.56580519 +0000 UTC m=+27.111158056 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565849 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:56.565836911 +0000 UTC m=+27.111189787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565851 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565873 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565906 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:52 crc 
kubenswrapper[4606]: E1212 00:23:52.565938 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:56.565928423 +0000 UTC m=+27.111281289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565766 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.565991 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:23:56.565977325 +0000 UTC m=+27.111330191 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.588165 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qtz7b" Dec 12 00:23:52 crc kubenswrapper[4606]: W1212 00:23:52.601307 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf437a237_3eb8_4817_b0a9_35efece69933.slice/crio-37c07278ad08e4387f308faeb776d3f374262bab50b597358bbacc17d3dd1628 WatchSource:0}: Error finding container 37c07278ad08e4387f308faeb776d3f374262bab50b597358bbacc17d3dd1628: Status 404 returned error can't find the container with id 37c07278ad08e4387f308faeb776d3f374262bab50b597358bbacc17d3dd1628 Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.628701 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.629053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.629068 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.629086 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.629103 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.699619 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.699815 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.700987 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.701003 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.701070 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.701156 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.716387 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.727385 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.731124 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.731512 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.731557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.731568 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.731590 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.731602 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.734070 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.749788 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.762831 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.775132 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.787125 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.799132 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.818840 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.834314 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.834343 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.834355 4606 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.834370 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.834382 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.837454 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.849954 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.862666 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.862715 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.864044 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qtz7b" event={"ID":"f437a237-3eb8-4817-b0a9-35efece69933","Type":"ContainerStarted","Data":"f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.864092 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qtz7b" 
event={"ID":"f437a237-3eb8-4817-b0a9-35efece69933","Type":"ContainerStarted","Data":"37c07278ad08e4387f308faeb776d3f374262bab50b597358bbacc17d3dd1628"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.865859 4606 generic.go:334] "Generic (PLEG): container finished" podID="470c2076-46bf-4305-9fb1-3e509eb4d4f0" containerID="06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee" exitCode=0 Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.865973 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerDied","Data":"06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee"} Dec 12 00:23:52 crc kubenswrapper[4606]: E1212 00:23:52.873275 4606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.873622 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.887598 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.898546 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.909985 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.925225 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.937000 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:52 crc 
kubenswrapper[4606]: I1212 00:23:52.937035 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.937046 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.937062 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.937073 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:52Z","lastTransitionTime":"2025-12-12T00:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.945288 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.963278 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.976222 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:52 crc kubenswrapper[4606]: I1212 00:23:52.989280 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.028555 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.043243 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.043269 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.043277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.043290 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.043300 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.073644 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.109338 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.145829 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc 
kubenswrapper[4606]: I1212 00:23:53.145857 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.145865 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.145879 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.145888 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.149456 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.189311 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.226388 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.248653 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.248695 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.248705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.248722 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.248733 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.268137 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.308237 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.351764 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.351820 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.351831 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.351847 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.352194 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.357980 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.399247 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331c
cd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.447432 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.455462 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.455500 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.455518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.455537 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.455548 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.558237 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.558268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.558284 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.558306 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.558320 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.662096 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.662163 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.662214 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.662242 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.662266 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.765578 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.765620 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.765636 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.765659 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.765678 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.868335 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.868720 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.868818 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.868899 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.869007 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.873715 4606 generic.go:334] "Generic (PLEG): container finished" podID="470c2076-46bf-4305-9fb1-3e509eb4d4f0" containerID="328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628" exitCode=0 Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.874365 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerDied","Data":"328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.893818 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.909648 4606 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.924443 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.939355 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.951785 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.960820 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.973390 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.973668 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.973698 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.973709 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.973725 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.973736 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:53Z","lastTransitionTime":"2025-12-12T00:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:53 crc kubenswrapper[4606]: I1212 00:23:53.989194 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.001096 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:53Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.020846 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.032414 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a76
7ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.050895 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.064170 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.077475 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.077510 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.077520 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 
00:23:54.077537 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.077548 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.081477 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.093722 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836
f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.179760 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.179813 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.179832 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc 
kubenswrapper[4606]: I1212 00:23:54.179849 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.179860 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.282580 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.282618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.282628 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.282644 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.282654 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.385118 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.385331 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.385339 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.385351 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.385359 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.488455 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.488512 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.488531 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.488555 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.488573 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.591790 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.591846 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.591864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.591888 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.591906 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.694912 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.694971 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.694987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.695013 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.695029 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.699673 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.699720 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.699741 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:54 crc kubenswrapper[4606]: E1212 00:23:54.699847 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:23:54 crc kubenswrapper[4606]: E1212 00:23:54.700005 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:23:54 crc kubenswrapper[4606]: E1212 00:23:54.700159 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.798587 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.798636 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.798653 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.798676 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.798693 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.881833 4606 generic.go:334] "Generic (PLEG): container finished" podID="470c2076-46bf-4305-9fb1-3e509eb4d4f0" containerID="09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e" exitCode=0 Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.881930 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerDied","Data":"09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.889726 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.901920 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.901993 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.902018 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.902048 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.902069 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:54Z","lastTransitionTime":"2025-12-12T00:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.914613 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.936219 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.952546 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.971431 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:54 crc kubenswrapper[4606]: I1212 00:23:54.986769 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.001091 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:54Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.004273 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.004318 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.004334 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.004354 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.004366 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.018682 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.039260 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.065349 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.082318 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.094788 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.106327 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.106367 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.106378 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.106394 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.106406 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.111421 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.123414 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.136440 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.149117 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.209069 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc 
kubenswrapper[4606]: I1212 00:23:55.209098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.209107 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.209120 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.209129 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.311745 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.311779 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.311791 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.311806 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.311817 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.415100 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.415214 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.415246 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.415275 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.415298 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.518888 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.518947 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.518970 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.518998 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.519019 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.621125 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.621241 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.621270 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.621305 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.621379 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.723738 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.723782 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.723800 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.723821 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.723839 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.826386 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.826433 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.826449 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.826474 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.826494 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.896099 4606 generic.go:334] "Generic (PLEG): container finished" podID="470c2076-46bf-4305-9fb1-3e509eb4d4f0" containerID="8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c" exitCode=0 Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.896144 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerDied","Data":"8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.914844 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.930875 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.930911 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.930919 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:55 crc 
kubenswrapper[4606]: I1212 00:23:55.930933 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.930944 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:55Z","lastTransitionTime":"2025-12-12T00:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.932260 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.950774 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.972421 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:55 crc kubenswrapper[4606]: I1212 00:23:55.988280 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:55.999750 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836
f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:55Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.011265 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.029736 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.032774 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.032824 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.032836 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.032853 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.032865 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.046230 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.058766 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.071927 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.081497 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.096835 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12
T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.107661 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.122100 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.135456 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.135491 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.135503 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.135518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.135528 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.237059 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.237092 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.237102 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.237116 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.237126 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.339501 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.339532 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.339543 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.339556 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.339566 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.445633 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.445696 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.445715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.445743 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.445761 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.500544 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.500793 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:24:04.500775944 +0000 UTC m=+35.046128820 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.548076 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.548119 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.548132 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.548151 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.548165 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.601255 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.601297 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.601317 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.601343 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601453 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601469 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601491 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601526 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:04.601513759 +0000 UTC m=+35.146866625 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601844 4606 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601877 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-12 00:24:04.601868189 +0000 UTC m=+35.147221055 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601924 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.601959 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:04.601951711 +0000 UTC m=+35.147304577 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.602010 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.602023 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.602033 4606 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.602059 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:04.602051874 +0000 UTC m=+35.147404740 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.649929 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.649954 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.649962 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.649975 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.649983 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.699652 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.699757 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.699816 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.699835 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.699953 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:23:56 crc kubenswrapper[4606]: E1212 00:23:56.700163 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.753160 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.753252 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.753272 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.753295 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.753313 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.856135 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.856216 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.856235 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.856259 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.856273 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.904774 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.905146 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.905167 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.912029 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" event={"ID":"470c2076-46bf-4305-9fb1-3e509eb4d4f0","Type":"ContainerStarted","Data":"78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.925333 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.930246 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.940370 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.951813 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.957929 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.957970 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.957984 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.958001 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.958014 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:56Z","lastTransitionTime":"2025-12-12T00:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.965904 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.977504 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:56 crc kubenswrapper[4606]: I1212 00:23:56.990028 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:56Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.005079 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.029082 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.051476 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.060277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.060403 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.060574 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.060658 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.060741 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.066900 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.095755 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.119818 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.136297 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.159941 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.162423 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 
00:23:57.162496 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.162545 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.162563 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.162577 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.176956 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.192551 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.205570 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.216164 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.236533 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.257753 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.264916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.264949 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.264960 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.264979 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.264988 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.280619 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.296565 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.310817 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.326007 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.339137 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.353639 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.368047 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 
00:23:57.368122 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.368149 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.368183 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.368198 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.369204 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.382346 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.396274 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.412438 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.471844 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.471897 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.471914 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.471941 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.471958 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.574271 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.574326 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.574355 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.574387 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.574406 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.676685 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.676718 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.676728 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.676743 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.676754 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.779493 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.779572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.779594 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.779628 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.779650 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.882374 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.882433 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.882451 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.882475 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.882492 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.916634 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.947632 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.960546 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-res
olver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.981247 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f4
09c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:57Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.984964 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.985014 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.985028 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.985047 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:57 crc kubenswrapper[4606]: I1212 00:23:57.985059 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:57Z","lastTransitionTime":"2025-12-12T00:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.002499 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.019467 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.037264 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.054560 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.071296 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.087354 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.087774 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.087819 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.087830 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.087844 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.087854 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.103813 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.118636 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.142364 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.160803 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.194466 4606 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.198665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.198704 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.198717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.198731 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.198740 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.209923 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.223309 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:58Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.300855 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.300899 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.300913 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.300931 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.300962 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.402999 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.403044 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.403055 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.403074 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.403087 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.505247 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.505302 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.505313 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.505327 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.505335 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.608603 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.608849 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.608912 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.608979 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.609060 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.699533 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.699884 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.699971 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:23:58 crc kubenswrapper[4606]: E1212 00:23:58.700230 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:23:58 crc kubenswrapper[4606]: E1212 00:23:58.700144 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:23:58 crc kubenswrapper[4606]: E1212 00:23:58.700061 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.718399 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.718427 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.718437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.718451 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.718462 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.820725 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.820755 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.820763 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.820776 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.820784 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.923056 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.923100 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.923109 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.923124 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:58 crc kubenswrapper[4606]: I1212 00:23:58.923132 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:58Z","lastTransitionTime":"2025-12-12T00:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.025730 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.025796 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.025811 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.025831 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.025843 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.128371 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.128427 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.128443 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.128464 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.128481 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.230584 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.230663 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.230675 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.230686 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.230695 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.333219 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.333262 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.333277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.333293 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.333304 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.436707 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.436779 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.436804 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.436835 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.436857 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.539946 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.539986 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.539995 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.540010 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.540022 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.642237 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.642282 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.642294 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.642334 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.642346 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.718412 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.732144 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.744920 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.744964 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.744977 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.744997 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.745015 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.748725 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.767167 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.787242 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.803258 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.821896 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.844645 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.847244 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.847277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.847286 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.847300 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.847309 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.861476 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.873035 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.894878 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.908352 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.926061 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/0.log" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.926997 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.929209 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45" exitCode=1 Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.929258 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.930041 4606 scope.go:117] "RemoveContainer" containerID="2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.939595 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.953073 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.953106 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.953117 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.953133 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.953144 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:23:59Z","lastTransitionTime":"2025-12-12T00:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.956906 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.969015 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.978033 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:23:59 crc kubenswrapper[4606]: I1212 00:23:59.992271 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:23:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.007717 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00
:23:59Z\\\",\\\"message\\\":\\\"ing reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:23:59.117007 5872 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:23:59.117044 5872 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:23:59.117090 5872 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:23:59.117122 5872 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:23:59.117239 5872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1212 00:23:59.117303 5872 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1212 00:23:59.117334 5872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1212 00:23:59.117385 5872 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1212 00:23:59.117439 5872 factory.go:656] Stopping watch factory\\\\nI1212 00:23:59.117508 5872 ovnkube.go:599] Stopped ovnkube\\\\nI1212 00:23:59.117281 5872 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:23:59.117164 5872 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:23:59.117358 5872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:23:59.117688 5872 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:23:59.117476 5872 handler.go:208] Removed *v1.Node event handler 
7\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d44
8cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.018843 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.032970 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.047696 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.055153 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc 
kubenswrapper[4606]: I1212 00:24:00.055208 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.055220 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.055239 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.055252 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.067849 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.081059 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.101988 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.112638 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.120733 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.133902 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12
T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.145794 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.157773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.157810 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.157820 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.157835 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.157843 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.158271 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.260377 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.260418 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.260429 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.260448 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.260461 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.361849 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.361886 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.361897 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.361922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.361932 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.463801 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.463828 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.463836 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.463849 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.463858 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.565639 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.565670 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.565680 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.565696 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.565706 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.668214 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.668251 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.668264 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.668277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.668286 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.698793 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.698830 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.698908 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:00 crc kubenswrapper[4606]: E1212 00:24:00.698909 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:00 crc kubenswrapper[4606]: E1212 00:24:00.699010 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:00 crc kubenswrapper[4606]: E1212 00:24:00.699086 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.770335 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.770395 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.770413 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.770437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.770454 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.872311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.872372 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.872390 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.872415 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.872434 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.936506 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/0.log" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.939689 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.940336 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.974565 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.974607 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.974618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.974635 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.974647 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:00Z","lastTransitionTime":"2025-12-12T00:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:00 crc kubenswrapper[4606]: I1212 00:24:00.979139 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.000060 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.014612 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.030060 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.046623 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.061903 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.077444 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.077886 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc 
kubenswrapper[4606]: I1212 00:24:01.077940 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.077959 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.077987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.078005 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.092712 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.106839 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.117333 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.133973 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.149092 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.160972 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.178646 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.181073 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.181101 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.181114 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.181147 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.181160 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.213253 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:23:59Z\\\",\\\"message\\\":\\\"ing reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:23:59.117007 5872 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:23:59.117044 5872 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:23:59.117090 5872 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1212 00:23:59.117122 5872 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:23:59.117239 5872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1212 00:23:59.117303 5872 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1212 00:23:59.117334 5872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1212 00:23:59.117385 5872 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1212 00:23:59.117439 5872 factory.go:656] Stopping watch factory\\\\nI1212 00:23:59.117508 5872 ovnkube.go:599] Stopped ovnkube\\\\nI1212 00:23:59.117281 5872 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:23:59.117164 5872 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:23:59.117358 5872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:23:59.117688 5872 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:23:59.117476 5872 handler.go:208] Removed *v1.Node event handler 
7\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.283930 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.283985 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.284008 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.284036 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.284058 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.288625 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.288671 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.288689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.288710 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.288725 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: E1212 00:24:01.304753 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.309518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.309573 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.309599 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.309626 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.309648 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: E1212 00:24:01.329915 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.334232 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.334267 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.334278 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.334296 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.334307 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: E1212 00:24:01.349711 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.353682 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.353717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.353730 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.353749 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.353763 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: E1212 00:24:01.366802 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.369763 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.369793 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.369802 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.369814 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.369823 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: E1212 00:24:01.381383 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: E1212 00:24:01.381490 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.385650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.385680 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.385693 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.385712 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.385728 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.489276 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.489337 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.489361 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.489393 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.489417 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.592508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.592561 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.592575 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.592598 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.592615 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.695879 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.695944 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.695967 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.695998 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.696021 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.731897 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps"] Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.732502 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.734487 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.735619 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.747817 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbjbt\" (UniqueName: \"kubernetes.io/projected/4c831d6d-b07d-46dd-adc0-85239379350f-kube-api-access-hbjbt\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.747951 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c831d6d-b07d-46dd-adc0-85239379350f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.747991 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c831d6d-b07d-46dd-adc0-85239379350f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.748053 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c831d6d-b07d-46dd-adc0-85239379350f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.756020 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ov
nkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.769677 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.780680 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.794766 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.800618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.800649 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.800660 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.800675 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.800686 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.814302 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.829510 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.849008 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c831d6d-b07d-46dd-adc0-85239379350f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.849069 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c831d6d-b07d-46dd-adc0-85239379350f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.849125 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c831d6d-b07d-46dd-adc0-85239379350f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.849166 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbjbt\" (UniqueName: 
\"kubernetes.io/projected/4c831d6d-b07d-46dd-adc0-85239379350f-kube-api-access-hbjbt\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.850033 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c831d6d-b07d-46dd-adc0-85239379350f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.852076 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.852327 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c831d6d-b07d-46dd-adc0-85239379350f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.863683 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c831d6d-b07d-46dd-adc0-85239379350f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.877803 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:23:59Z\\\",\\\"message\\\":\\\"ing reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:23:59.117007 5872 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:23:59.117044 5872 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:23:59.117090 5872 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1212 00:23:59.117122 5872 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:23:59.117239 5872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1212 00:23:59.117303 5872 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1212 00:23:59.117334 5872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1212 00:23:59.117385 5872 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1212 00:23:59.117439 5872 factory.go:656] Stopping watch factory\\\\nI1212 00:23:59.117508 5872 ovnkube.go:599] Stopped ovnkube\\\\nI1212 00:23:59.117281 5872 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:23:59.117164 5872 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:23:59.117358 5872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:23:59.117688 5872 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:23:59.117476 5872 handler.go:208] Removed *v1.Node event handler 
7\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.879069 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbjbt\" (UniqueName: \"kubernetes.io/projected/4c831d6d-b07d-46dd-adc0-85239379350f-kube-api-access-hbjbt\") pod \"ovnkube-control-plane-749d76644c-7rxps\" (UID: \"4c831d6d-b07d-46dd-adc0-85239379350f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.894991 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.902689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.902714 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.902724 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.902739 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.902751 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:01Z","lastTransitionTime":"2025-12-12T00:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.919489 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.933065 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.943661 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/1.log" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.944115 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/0.log" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.946618 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec" exitCode=1 Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.946653 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec"} Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.946719 4606 scope.go:117] "RemoveContainer" containerID="2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.947380 4606 
scope.go:117] "RemoveContainer" containerID="5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec" Dec 12 00:24:01 crc kubenswrapper[4606]: E1212 00:24:01.947570 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.949481 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372
b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.963819 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.973680 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.982998 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:01 crc kubenswrapper[4606]: I1212 00:24:01.994341 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:01Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.004444 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc 
kubenswrapper[4606]: I1212 00:24:02.004479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.004490 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.004508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.004520 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.007642 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.020296 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.040096 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e9ccad70d1286f1c2de41f13ad508f217e20f2351d2b31d5153fdfdcdc79d45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:23:59Z\\\",\\\"message\\\":\\\"ing reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:23:59.117007 5872 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:23:59.117044 5872 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:23:59.117090 5872 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1212 00:23:59.117122 5872 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:23:59.117239 5872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1212 00:23:59.117303 5872 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1212 00:23:59.117334 5872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1212 00:23:59.117385 5872 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1212 00:23:59.117439 5872 factory.go:656] Stopping watch factory\\\\nI1212 00:23:59.117508 5872 ovnkube.go:599] Stopped ovnkube\\\\nI1212 00:23:59.117281 5872 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:23:59.117164 5872 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:23:59.117358 5872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:23:59.117688 5872 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:23:59.117476 5872 handler.go:208] Removed *v1.Node event handler 7\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\
\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.050241 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.055714 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: W1212 00:24:02.064593 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c831d6d_b07d_46dd_adc0_85239379350f.slice/crio-b435496bcb5a5c8cb0bfe8e0be6b90bff2e46240086db5d960ef0c6dd0ac201b WatchSource:0}: Error finding container b435496bcb5a5c8cb0bfe8e0be6b90bff2e46240086db5d960ef0c6dd0ac201b: Status 404 returned error can't find the container with id b435496bcb5a5c8cb0bfe8e0be6b90bff2e46240086db5d960ef0c6dd0ac201b Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.075085 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.085706 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.097444 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.107577 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.107616 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 
00:24:02.107627 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.107643 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.107654 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.111688 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.124798 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.136673 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.151518 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.170326 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.182225 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.199592 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 
00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.210066 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.210099 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.210109 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.210124 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.210135 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.214089 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.223273 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.313840 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.313889 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.313912 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.313939 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.313960 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.417117 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.417261 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.417847 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.417948 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.418246 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.521109 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.521224 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.521252 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.521282 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.521306 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.624275 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.624350 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.624390 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.624413 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.624432 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.699291 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.699400 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:02 crc kubenswrapper[4606]: E1212 00:24:02.699439 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.699501 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:02 crc kubenswrapper[4606]: E1212 00:24:02.699663 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:02 crc kubenswrapper[4606]: E1212 00:24:02.699878 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.726435 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.726480 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.726497 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.726520 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.726536 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.829136 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.829192 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.829205 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.829222 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.829233 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.933402 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.933451 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.933467 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.933490 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.933505 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:02Z","lastTransitionTime":"2025-12-12T00:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.950147 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" event={"ID":"4c831d6d-b07d-46dd-adc0-85239379350f","Type":"ContainerStarted","Data":"ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.950234 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" event={"ID":"4c831d6d-b07d-46dd-adc0-85239379350f","Type":"ContainerStarted","Data":"b435496bcb5a5c8cb0bfe8e0be6b90bff2e46240086db5d960ef0c6dd0ac201b"} Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.952569 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/1.log" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.957408 4606 scope.go:117] "RemoveContainer" containerID="5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec" Dec 12 00:24:02 crc kubenswrapper[4606]: E1212 00:24:02.957574 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.969955 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.982229 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:02 crc kubenswrapper[4606]: I1212 00:24:02.993588 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.005048 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.035739 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.035782 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.035795 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.035817 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.035829 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.039880 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.065775 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.080241 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.095532 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.105875 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.125248 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.137425 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.137457 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.137465 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.137479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.137488 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.140710 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 
00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.154122 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.174704 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.183295 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.193509 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12
T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.203228 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.239268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.239305 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.239319 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.239338 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.239350 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.341724 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.341757 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.341766 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.341780 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.341788 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.444407 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.444463 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.444479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.444503 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.444523 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.546943 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.547005 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.547021 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.547044 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.547061 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.614818 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-mjjwd"] Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.615639 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:03 crc kubenswrapper[4606]: E1212 00:24:03.615746 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.635142 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.650013 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.650068 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.650080 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.650099 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.650114 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.652427 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.665485 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.667905 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwtt\" (UniqueName: \"kubernetes.io/projected/0853dce1-c009-407e-960d-1113f85e503f-kube-api-access-rwwtt\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.667959 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.684060 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f4
09c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.710222 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.724649 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.739446 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc 
kubenswrapper[4606]: I1212 00:24:03.752293 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.752341 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.752357 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.752380 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.752394 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.754821 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z 
is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.769231 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwtt\" (UniqueName: \"kubernetes.io/projected/0853dce1-c009-407e-960d-1113f85e503f-kube-api-access-rwwtt\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.769331 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:03 crc kubenswrapper[4606]: E1212 00:24:03.769553 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:03 crc kubenswrapper[4606]: E1212 00:24:03.769637 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:24:04.269618504 +0000 UTC m=+34.814971370 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.778800 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49a
d6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.794074 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.799980 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwwtt\" (UniqueName: \"kubernetes.io/projected/0853dce1-c009-407e-960d-1113f85e503f-kube-api-access-rwwtt\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.809733 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.821206 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.831840 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.841339 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.852746 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.854448 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.854486 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.854499 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.854516 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.854528 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.865457 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.874846 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.957650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.957698 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.957755 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.957781 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.957845 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:03Z","lastTransitionTime":"2025-12-12T00:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.963043 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" event={"ID":"4c831d6d-b07d-46dd-adc0-85239379350f","Type":"ContainerStarted","Data":"d66f7e90d4969cb2791da434d47d0e4fb269aa8760347c7d59bd8d16f4cf42c0"} Dec 12 00:24:03 crc kubenswrapper[4606]: I1212 00:24:03.978667 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:03Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.004731 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T
00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources
-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.019624 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.034258 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.049565 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.059893 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.060072 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.060166 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.060278 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.060366 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.062249 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.072818 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.084226 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.094553 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.103390 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.113898 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.125890 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.134735 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.147197 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.162913 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.162983 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.163005 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.163034 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.163052 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.169010 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.181910 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.191538 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc 
kubenswrapper[4606]: I1212 00:24:04.265613 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.265688 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.265709 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.265770 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.265807 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.274125 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.274307 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.274401 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:24:05.274378482 +0000 UTC m=+35.819731368 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.368120 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.368262 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.368300 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.368336 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.368360 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.471751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.471807 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.471822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.471843 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.471858 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.575421 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.575502 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.575527 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.575560 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.575585 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.577758 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.577988 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:24:20.577958296 +0000 UTC m=+51.123311202 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678243 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678338 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678355 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678385 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678409 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678505 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678555 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678583 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.678621 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678746 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678779 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678799 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678842 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678858 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678888 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:20.678854936 +0000 UTC m=+51.224207842 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678899 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678929 4606 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.678931 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:20.678911618 +0000 UTC m=+51.224264524 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.679030 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:20.6790013 +0000 UTC m=+51.224354246 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.679125 4606 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.679247 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:20.679229696 +0000 UTC m=+51.224582602 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.698733 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.698816 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.698871 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.699032 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.699147 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:04 crc kubenswrapper[4606]: E1212 00:24:04.699275 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.781431 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.781505 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.781525 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.781549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.781565 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.848539 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.866220 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.883705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.883775 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.883791 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.883817 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.883834 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.896327 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.914928 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.928755 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.939962 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.949937 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.958288 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.969975 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.981115 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.985781 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 
00:24:04.985811 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.985821 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.985839 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.985850 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:04Z","lastTransitionTime":"2025-12-12T00:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:04 crc kubenswrapper[4606]: I1212 00:24:04.994026 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:04Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.008293 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:05Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.018599 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:05Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.027210 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:05Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.043420 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:05Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.059847 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:05Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.068730 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:05Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.076841 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:05Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:05 crc 
kubenswrapper[4606]: I1212 00:24:05.087522 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.087565 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.087580 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.087597 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.087607 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.190287 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.190326 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.190346 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.190364 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.190375 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.286352 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:05 crc kubenswrapper[4606]: E1212 00:24:05.286556 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:05 crc kubenswrapper[4606]: E1212 00:24:05.286672 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:24:07.28664032 +0000 UTC m=+37.831993226 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.293267 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.293340 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.293359 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.293382 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.293400 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.396473 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.396541 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.396558 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.396588 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.396607 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.499014 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.499061 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.499077 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.499095 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.499108 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.601688 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.601747 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.601765 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.601788 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.601805 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.699662 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:05 crc kubenswrapper[4606]: E1212 00:24:05.699853 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.705995 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.706053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.706072 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.706098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.706118 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.808746 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.809854 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.809885 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.809914 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.809934 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.912904 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.912982 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.913007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.913034 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:05 crc kubenswrapper[4606]: I1212 00:24:05.913058 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:05Z","lastTransitionTime":"2025-12-12T00:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.016553 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.016796 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.016822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.016852 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.016873 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.119906 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.119965 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.119981 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.120003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.120021 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.222916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.222988 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.223003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.223022 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.223034 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.325713 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.325761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.325772 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.325792 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.325803 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.428742 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.428812 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.428838 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.428873 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.428896 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.531330 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.531405 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.531431 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.531461 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.531488 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.635358 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.635436 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.635459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.635488 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.635509 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.699517 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.699526 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:06 crc kubenswrapper[4606]: E1212 00:24:06.699699 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.699553 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:06 crc kubenswrapper[4606]: E1212 00:24:06.699914 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:06 crc kubenswrapper[4606]: E1212 00:24:06.700140 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.738562 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.738636 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.738655 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.738682 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.738705 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.842162 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.842262 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.842281 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.842306 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.842324 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.946738 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.946820 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.946838 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.946866 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:06 crc kubenswrapper[4606]: I1212 00:24:06.946885 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:06Z","lastTransitionTime":"2025-12-12T00:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.049951 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.050018 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.050032 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.050058 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.050078 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.151922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.151978 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.151992 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.152012 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.152027 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.255330 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.255383 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.255399 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.255424 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.255440 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.315560 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:07 crc kubenswrapper[4606]: E1212 00:24:07.315811 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:07 crc kubenswrapper[4606]: E1212 00:24:07.316116 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:24:11.316032961 +0000 UTC m=+41.861385897 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.358408 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.358479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.358502 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.358542 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.358572 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.461790 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.461845 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.461861 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.461884 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.461900 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.566393 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.566453 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.566477 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.566506 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.566528 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.669905 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.670276 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.670504 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.670676 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.670935 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.699474 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:07 crc kubenswrapper[4606]: E1212 00:24:07.699650 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.773810 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.773888 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.773910 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.773938 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.774019 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.876717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.876797 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.876823 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.876852 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.876871 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.980749 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.980916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.980998 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.981034 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:07 crc kubenswrapper[4606]: I1212 00:24:07.981122 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:07Z","lastTransitionTime":"2025-12-12T00:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.085010 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.085120 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.085223 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.085272 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.085298 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.188916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.188988 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.189012 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.189039 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.189057 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.291910 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.291977 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.291994 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.292018 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.292034 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.394912 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.395062 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.395086 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.395109 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.395126 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.497845 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.497887 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.497899 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.497913 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.497924 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.602517 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.602633 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.602674 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.602753 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.602774 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.699233 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.699284 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.699253 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:08 crc kubenswrapper[4606]: E1212 00:24:08.699459 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:08 crc kubenswrapper[4606]: E1212 00:24:08.699566 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:08 crc kubenswrapper[4606]: E1212 00:24:08.699807 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.706461 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.706589 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.706613 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.706642 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.706664 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.809525 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.809605 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.809641 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.809671 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.809690 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.912717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.912796 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.912819 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.912845 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:08 crc kubenswrapper[4606]: I1212 00:24:08.912866 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:08Z","lastTransitionTime":"2025-12-12T00:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.015816 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.015876 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.015896 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.015921 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.015938 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.118980 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.119035 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.119052 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.119077 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.119106 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.222799 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.222863 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.222880 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.222906 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.222924 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.327269 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.327333 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.327350 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.327374 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.327391 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.430658 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.430709 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.430726 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.430748 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.430765 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.533532 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.533656 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.533681 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.533713 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.533744 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.637211 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.637309 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.637327 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.637353 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.637372 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.699938 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:09 crc kubenswrapper[4606]: E1212 00:24:09.700500 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.718703 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.741762 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 
00:24:09.742011 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 
00:24:09.743479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.743773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.743926 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.744055 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.783448 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.811358 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.828367 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.847159 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.847226 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.847239 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.847257 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.847271 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.848806 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.863376 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.883712 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb61
41903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.903138 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.919607 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.937342 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.950030 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.950084 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.950102 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.950125 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.950139 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:09Z","lastTransitionTime":"2025-12-12T00:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.953235 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc 
kubenswrapper[4606]: I1212 00:24:09.966204 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.981370 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:09 crc kubenswrapper[4606]: I1212 00:24:09.996510 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.025423 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.039366 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:10Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.052317 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.052354 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.052365 4606 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.052382 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.052394 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.154847 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.154878 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.154888 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.154918 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.154927 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.257108 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.257155 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.257198 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.257238 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.257269 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.360219 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.360551 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.360733 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.360870 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.361026 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.464475 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.464536 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.464556 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.464584 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.464604 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.567378 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.567445 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.567463 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.567490 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.567514 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.669899 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.669940 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.669952 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.669969 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.669980 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.699652 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.699662 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.699690 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:10 crc kubenswrapper[4606]: E1212 00:24:10.699789 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:10 crc kubenswrapper[4606]: E1212 00:24:10.699886 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:10 crc kubenswrapper[4606]: E1212 00:24:10.699984 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.773816 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.773890 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.773916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.773947 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.773974 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.877619 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.877678 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.877695 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.877720 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.877741 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.981005 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.981067 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.981083 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.981108 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:10 crc kubenswrapper[4606]: I1212 00:24:10.981124 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:10Z","lastTransitionTime":"2025-12-12T00:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.083839 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.083889 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.083900 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.083920 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.083935 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.186970 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.187026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.187045 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.187064 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.187075 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.289934 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.289992 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.290009 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.290031 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.290048 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.365620 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:11 crc kubenswrapper[4606]: E1212 00:24:11.365846 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:11 crc kubenswrapper[4606]: E1212 00:24:11.365919 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:24:19.36589647 +0000 UTC m=+49.911249376 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.392952 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.393009 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.393023 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.393045 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.393056 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.495465 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.495520 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.495537 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.495557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.495570 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.598526 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.598570 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.598585 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.598604 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.598620 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.699516 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:11 crc kubenswrapper[4606]: E1212 00:24:11.699656 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.701117 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.701152 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.701165 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.701206 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.701218 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.713550 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.713583 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.713592 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.713603 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.713612 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: E1212 00:24:11.731233 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.736882 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.736938 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.736950 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.736978 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.736994 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: E1212 00:24:11.750965 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.756282 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.756340 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.756352 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.756375 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.756391 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.779765 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.779835 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.779859 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.779890 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.779917 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.806347 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.806402 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.806412 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.806431 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.806446 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: E1212 00:24:11.820765 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:11Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:11 crc kubenswrapper[4606]: E1212 00:24:11.820899 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.823988 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.824018 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.824030 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.824049 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.824063 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.927767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.927854 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.927874 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.927906 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:11 crc kubenswrapper[4606]: I1212 00:24:11.927935 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:11Z","lastTransitionTime":"2025-12-12T00:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.031706 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.031773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.031790 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.031922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.031949 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.135609 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.135704 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.135725 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.135750 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.135765 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.242161 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.242237 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.242251 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.242277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.242296 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.346563 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.346617 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.346628 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.346648 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.346662 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.449922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.450002 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.450027 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.450058 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.450082 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.552727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.552804 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.552830 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.552865 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.552886 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.655957 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.655989 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.655998 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.656026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.656036 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.699286 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:12 crc kubenswrapper[4606]: E1212 00:24:12.699497 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.699286 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.699286 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:12 crc kubenswrapper[4606]: E1212 00:24:12.699597 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:12 crc kubenswrapper[4606]: E1212 00:24:12.699677 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.759759 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.759816 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.759834 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.759859 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.759877 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.862693 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.862755 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.862771 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.862797 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.862815 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.965760 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.965799 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.965810 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.965827 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:12 crc kubenswrapper[4606]: I1212 00:24:12.965839 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:12Z","lastTransitionTime":"2025-12-12T00:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.068902 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.068949 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.068960 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.068977 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.068988 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.171917 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.171979 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.171995 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.172018 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.172036 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.274606 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.274665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.274684 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.274704 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.274718 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.377212 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.377258 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.377269 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.377286 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.377300 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.480100 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.480152 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.480205 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.480235 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.480256 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.582673 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.582742 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.582761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.582787 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.582805 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.684459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.684487 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.684495 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.684508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.684518 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.699163 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:13 crc kubenswrapper[4606]: E1212 00:24:13.699292 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.786863 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.786939 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.786961 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.786984 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.787001 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.889595 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.889655 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.889667 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.889680 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.889688 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.992973 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.993060 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.993082 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.993111 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:13 crc kubenswrapper[4606]: I1212 00:24:13.993132 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:13Z","lastTransitionTime":"2025-12-12T00:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.096007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.096052 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.096063 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.096086 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.096100 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.198470 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.198530 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.198550 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.198585 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.198620 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.300649 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.300689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.300700 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.300715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.300726 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.403905 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.403957 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.404027 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.404051 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.404065 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.507491 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.507580 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.507598 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.507625 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.507671 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.610936 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.610990 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.611007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.611029 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.611045 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.698627 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.698690 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.698801 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:14 crc kubenswrapper[4606]: E1212 00:24:14.698803 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:14 crc kubenswrapper[4606]: E1212 00:24:14.699461 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:14 crc kubenswrapper[4606]: E1212 00:24:14.699583 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.700016 4606 scope.go:117] "RemoveContainer" containerID="5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.716097 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.716578 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.716602 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.716631 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.716655 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.819971 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.820026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.820036 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.820052 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.820064 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.923714 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.923781 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.923804 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.923832 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:14 crc kubenswrapper[4606]: I1212 00:24:14.923853 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:14Z","lastTransitionTime":"2025-12-12T00:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.027387 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.027444 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.027463 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.027492 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.027510 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.129198 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.129217 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.129225 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.129238 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.129247 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.231329 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.231360 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.231371 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.231385 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.231395 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.333453 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.333485 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.333495 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.333510 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.333520 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.435949 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.435981 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.435992 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.436007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.436018 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.537809 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.537868 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.537890 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.537910 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.537925 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.640133 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.640227 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.640245 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.640271 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.640288 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.699165 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:15 crc kubenswrapper[4606]: E1212 00:24:15.699388 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.742661 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.742707 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.742719 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.742739 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.742751 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.845098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.845146 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.845155 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.845168 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.845205 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.948725 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.948780 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.948796 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.948847 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:15 crc kubenswrapper[4606]: I1212 00:24:15.948869 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:15Z","lastTransitionTime":"2025-12-12T00:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.014981 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/1.log" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.020245 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.021114 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.041450 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd
0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.052747 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.052813 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.052837 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.052871 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.052893 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.062483 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z 
is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.088101 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.109904 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.130971 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.151379 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.156115 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.156216 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.156245 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 
00:24:16.156282 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.156309 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.168530 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.192554 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.216597 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.232708 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.259751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.259814 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.259839 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.259869 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.259886 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.259881 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.284300 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc 
kubenswrapper[4606]: I1212 00:24:16.308216 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.323503 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.338186 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.359044 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] 
Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.362281 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.362311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.362321 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.362337 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.362349 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.371806 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269aa8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:16Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.464907 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.464975 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.464992 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.465015 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.465034 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.567911 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.567993 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.568024 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.568053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.568096 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.670892 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.670960 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.670982 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.671012 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.671038 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.699071 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.699077 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.699102 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:16 crc kubenswrapper[4606]: E1212 00:24:16.699349 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:16 crc kubenswrapper[4606]: E1212 00:24:16.699440 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:16 crc kubenswrapper[4606]: E1212 00:24:16.699565 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.773933 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.774022 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.774042 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.774068 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.774082 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.876370 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.876452 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.876479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.876512 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.876548 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.978525 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.978583 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.978600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.978622 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:16 crc kubenswrapper[4606]: I1212 00:24:16.978638 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:16Z","lastTransitionTime":"2025-12-12T00:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.027741 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/2.log" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.028853 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/1.log" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.033410 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37" exitCode=1 Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.033476 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.033530 4606 scope.go:117] "RemoveContainer" containerID="5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.034776 4606 scope.go:117] "RemoveContainer" containerID="8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37" Dec 12 00:24:17 crc kubenswrapper[4606]: E1212 00:24:17.035093 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.054042 4606 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.071735 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.081101 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 
00:24:17.081139 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.081147 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.081185 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.081198 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.086050 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.103221 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.116437 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.126831 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.139766 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.155746 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.168407 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.179894 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc 
kubenswrapper[4606]: I1212 00:24:17.183832 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.183899 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.183920 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.183945 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.183966 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.193027 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z 
is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.212333 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.228995 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.241949 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.253944 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.264319 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.276322 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.286128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 
00:24:17.286215 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.286233 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.286260 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.286279 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.389756 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.389829 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.389839 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.389859 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.389872 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.493327 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.493399 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.493423 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.493452 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.493474 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.597952 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.598010 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.598031 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.598073 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.598133 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.674295 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.686653 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.695657 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.699425 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:17 crc kubenswrapper[4606]: E1212 00:24:17.699671 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.701935 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.701993 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.702007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.702026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.702038 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.713450 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269aa8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.728523 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc 
kubenswrapper[4606]: I1212 00:24:17.747304 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.761029 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.784244 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.805053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.805345 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.805442 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.805549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.805639 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.809084 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cd9db0d886b8f1de6f10f2be1348147d443ba4f3e542b171b1daf9f5abbd1ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"message\\\":\\\"retry setup to complete in iterateRetryResources\\\\nF1212 00:24:00.622973 5993 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:00Z is after 2025-08-24T17:21:41Z]\\\\nI1212 00:24:00.622999 5993 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623015 5993 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1212 00:24:00.623022 5993 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1212 00:24:00.623021 5993 obj_retry.go:303] Ret\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) 
from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.827295 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.842539 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.861875 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.886160 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf
3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.904764 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 
00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.908601 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.908644 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.908660 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.908678 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.908690 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:17Z","lastTransitionTime":"2025-12-12T00:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.922928 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542
428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.939782 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.953516 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.965498 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12
T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:17 crc kubenswrapper[4606]: I1212 00:24:17.979082 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:17Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.011549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.011595 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.011609 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.011626 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.011637 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.037493 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/2.log" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.041219 4606 scope.go:117] "RemoveContainer" containerID="8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37" Dec 12 00:24:18 crc kubenswrapper[4606]: E1212 00:24:18.041393 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.052602 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836
f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.066556 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.083397 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.101900 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.113713 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.113761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.113774 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.113794 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.113808 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.116796 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542
428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.130571 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.144630 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.158535 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.170219 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.179282 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.186859 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.197361 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.205479 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.215495 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.215553 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.215578 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.215598 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.215611 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.215821 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.223780 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.234750 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.253501 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalvers
ions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.271150 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:18Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.318763 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.318822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.318842 4606 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.318866 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.318884 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.422324 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.422448 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.422478 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.422508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.422533 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.525411 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.525464 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.525480 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.525500 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.525515 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.628725 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.628789 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.628807 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.628831 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.628850 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.699116 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.699116 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:18 crc kubenswrapper[4606]: E1212 00:24:18.699413 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.699163 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:18 crc kubenswrapper[4606]: E1212 00:24:18.699560 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:18 crc kubenswrapper[4606]: E1212 00:24:18.699705 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.732013 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.732086 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.732131 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.732164 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.732220 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.835311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.835377 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.835398 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.835427 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.835448 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.938756 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.938821 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.938874 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.938904 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:18 crc kubenswrapper[4606]: I1212 00:24:18.938924 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:18Z","lastTransitionTime":"2025-12-12T00:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.041949 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.042010 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.042033 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.042065 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.042087 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.145370 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.145437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.145461 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.145491 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.145512 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.248886 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.248950 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.248970 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.248994 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.249010 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.351737 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.352466 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.352779 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.352878 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.352967 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.453792 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:19 crc kubenswrapper[4606]: E1212 00:24:19.454147 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:19 crc kubenswrapper[4606]: E1212 00:24:19.454291 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:24:35.454259363 +0000 UTC m=+65.999612269 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.456410 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.456523 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.456653 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.456761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.456882 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.559937 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.559990 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.560006 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.560030 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.560048 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.662411 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.662442 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.662452 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.662464 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.662473 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.699694 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:19 crc kubenswrapper[4606]: E1212 00:24:19.700083 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.718106 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.731951 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.749937 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.763190 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.764646 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.764750 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.764905 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.765044 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.765205 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.781724 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.813499 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 
00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.828600 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.840492 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc 
kubenswrapper[4606]: I1212 00:24:19.853296 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.865957 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.867217 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.867252 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.867263 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.867280 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.867293 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.882750 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.895097 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.907422 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.919864 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.935968 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.962032 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.969804 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.969836 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.969847 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.969862 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.969872 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:19Z","lastTransitionTime":"2025-12-12T00:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:19 crc kubenswrapper[4606]: I1212 00:24:19.984077 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767
ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.001930 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:19Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.071873 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.071934 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.071951 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.071972 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.071984 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.174527 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.174579 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.174592 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.174611 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.174624 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.278087 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.278122 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.278133 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.278149 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.278161 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.381249 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.381556 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.381679 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.381819 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.381939 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.484845 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.485268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.485348 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.485407 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.485459 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.588705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.588750 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.588763 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.588781 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.588794 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.666663 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.666833 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:24:52.666805852 +0000 UTC m=+83.212158718 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.690895 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.690940 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.690958 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.690978 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.690993 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.699340 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.699339 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.699455 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.699509 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.699355 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.699562 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.768416 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.768451 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.768468 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.768491 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768597 4606 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768620 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768682 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768690 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:52.768673338 +0000 UTC m=+83.314026204 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768697 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768710 4606 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:20 crc 
kubenswrapper[4606]: E1212 00:24:20.768752 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:52.76874008 +0000 UTC m=+83.314092936 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768631 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768770 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768788 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:52.768781741 +0000 UTC m=+83.314134607 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768816 4606 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: E1212 00:24:20.768836 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:24:52.768830752 +0000 UTC m=+83.314183618 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.793549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.793606 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.793623 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.793648 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.793663 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.896051 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.896098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.896111 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.896129 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.896141 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.998885 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.998966 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.998990 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.999023 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:20 crc kubenswrapper[4606]: I1212 00:24:20.999040 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:20Z","lastTransitionTime":"2025-12-12T00:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.102431 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.102484 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.102501 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.102523 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.102542 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.204895 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.204929 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.204940 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.204956 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.204966 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.308143 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.308278 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.308303 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.308334 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.308355 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.411481 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.411547 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.411570 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.411597 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.411619 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.514140 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.514256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.514282 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.514309 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.514330 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.617334 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.617393 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.617417 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.617450 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.617473 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.699382 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:21 crc kubenswrapper[4606]: E1212 00:24:21.699676 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.719665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.719727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.719747 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.719773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.719793 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.823033 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.823094 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.823114 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.823139 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.823156 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.876595 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.876641 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.876650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.876665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.876676 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: E1212 00:24:21.895982 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.904483 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.904786 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.904901 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.905001 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.905112 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: E1212 00:24:21.920586 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.926008 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.926083 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.926095 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.926116 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.926128 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: E1212 00:24:21.941324 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.944822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.944846 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.944857 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.944872 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.944883 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: E1212 00:24:21.957900 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.960704 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.960730 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.960747 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.960890 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.960906 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:21 crc kubenswrapper[4606]: E1212 00:24:21.971244 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:21Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:21 crc kubenswrapper[4606]: E1212 00:24:21.971410 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.973311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.973338 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.973348 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.973364 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:21 crc kubenswrapper[4606]: I1212 00:24:21.973374 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:21Z","lastTransitionTime":"2025-12-12T00:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.075563 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.075600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.075612 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.075627 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.075638 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.178610 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.178646 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.178654 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.178671 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.178680 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.282101 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.282150 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.282166 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.282241 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.282259 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.384531 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.384585 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.384599 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.384618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.384630 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.487904 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.487964 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.487987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.488017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.488039 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.591074 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.591123 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.591142 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.591165 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.591209 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.693542 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.693594 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.693611 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.693636 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.693653 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.699322 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.699327 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:22 crc kubenswrapper[4606]: E1212 00:24:22.699723 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:22 crc kubenswrapper[4606]: E1212 00:24:22.699488 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.700123 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:22 crc kubenswrapper[4606]: E1212 00:24:22.700335 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.795642 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.795688 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.795699 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.795715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.795726 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.898437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.898531 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.898556 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.898586 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:22 crc kubenswrapper[4606]: I1212 00:24:22.898611 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:22Z","lastTransitionTime":"2025-12-12T00:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.001815 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.001863 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.001878 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.001897 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.001911 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.104162 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.104255 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.104266 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.104287 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.104299 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.207312 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.207388 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.207420 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.207440 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.207454 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.309541 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.309616 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.309628 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.309646 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.309657 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.412961 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.413014 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.413026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.413045 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.413058 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.515667 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.515751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.515776 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.515802 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.515822 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.618148 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.618217 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.618227 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.618266 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.618300 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.699430 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:23 crc kubenswrapper[4606]: E1212 00:24:23.699648 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.722263 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.722319 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.722335 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.722692 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.722736 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.825146 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.825228 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.825241 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.825258 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.825270 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.927648 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.927699 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.927722 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.927753 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:23 crc kubenswrapper[4606]: I1212 00:24:23.927775 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:23Z","lastTransitionTime":"2025-12-12T00:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.031132 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.031232 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.031256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.031298 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.031321 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.134910 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.134975 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.134992 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.135015 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.135030 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.237819 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.237871 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.237883 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.237901 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.237913 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.340761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.340823 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.340835 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.340864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.340879 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.447310 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.447366 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.447387 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.447412 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.447434 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.549836 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.549973 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.549993 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.550017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.550035 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.652663 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.652730 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.652743 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.652761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.652775 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.699388 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.699431 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.699449 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:24 crc kubenswrapper[4606]: E1212 00:24:24.699552 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:24 crc kubenswrapper[4606]: E1212 00:24:24.699677 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:24 crc kubenswrapper[4606]: E1212 00:24:24.699782 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.755457 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.755508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.755524 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.755543 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.755558 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.858230 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.858271 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.858284 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.858302 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.858314 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.961615 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.961683 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.961696 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.961717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:24 crc kubenswrapper[4606]: I1212 00:24:24.961729 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:24Z","lastTransitionTime":"2025-12-12T00:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.063698 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.063740 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.063748 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.063766 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.063776 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.166683 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.166750 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.166767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.166790 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.166809 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.269226 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.269292 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.269310 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.269336 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.269356 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.372472 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.372520 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.372533 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.372550 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.372562 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.475078 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.475143 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.475169 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.475270 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.475300 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.578240 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.578403 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.578476 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.578504 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.578526 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.681041 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.681106 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.681119 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.681162 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.681203 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.698783 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:25 crc kubenswrapper[4606]: E1212 00:24:25.698872 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.783835 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.783898 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.783917 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.783941 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.783958 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.886776 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.886824 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.886837 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.886854 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.886866 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.989057 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.989108 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.989123 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.989142 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:25 crc kubenswrapper[4606]: I1212 00:24:25.989155 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:25Z","lastTransitionTime":"2025-12-12T00:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.092528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.092591 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.092606 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.092628 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.092642 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.195650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.195760 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.195793 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.195813 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.195826 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.298098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.298140 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.298151 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.298165 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.298191 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.401137 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.401193 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.401202 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.401218 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.401227 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.504064 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.504126 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.504145 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.504210 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.504232 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.607627 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.607948 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.608144 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.608374 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.608581 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.698933 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:26 crc kubenswrapper[4606]: E1212 00:24:26.699543 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.698977 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:26 crc kubenswrapper[4606]: E1212 00:24:26.699667 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.698948 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:26 crc kubenswrapper[4606]: E1212 00:24:26.699753 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.711641 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.711682 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.711695 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.711718 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.711732 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.814348 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.814407 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.814421 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.814441 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.814458 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.918447 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.918880 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.919063 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.919361 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:26 crc kubenswrapper[4606]: I1212 00:24:26.919593 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:26Z","lastTransitionTime":"2025-12-12T00:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.023150 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.023273 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.023292 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.023320 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.023334 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.126312 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.126362 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.126375 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.126398 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.126416 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.229128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.229197 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.229215 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.229235 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.229249 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.332339 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.332673 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.332798 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.332917 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.333029 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.436724 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.436779 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.436795 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.436818 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.436836 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.538928 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.538990 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.539008 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.539035 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.539052 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.641744 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.641836 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.641865 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.641896 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.641916 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.699160 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:27 crc kubenswrapper[4606]: E1212 00:24:27.699433 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.745511 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.745604 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.745623 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.745650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.745668 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.849218 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.849285 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.849309 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.849336 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.849357 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.952261 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.952305 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.952318 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.952335 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:27 crc kubenswrapper[4606]: I1212 00:24:27.952346 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:27Z","lastTransitionTime":"2025-12-12T00:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.054784 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.054829 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.054842 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.054861 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.054876 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.158144 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.158201 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.158213 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.158230 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.158241 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.264401 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.264454 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.264468 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.264492 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.264552 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.366893 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.366955 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.366967 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.366983 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.366994 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.469410 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.469646 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.469737 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.469832 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.469936 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.572882 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.572915 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.572926 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.572943 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.572954 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.675484 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.675535 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.675544 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.675557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.675569 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.699530 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.699574 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:28 crc kubenswrapper[4606]: E1212 00:24:28.699703 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.699951 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:28 crc kubenswrapper[4606]: E1212 00:24:28.700052 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:28 crc kubenswrapper[4606]: E1212 00:24:28.700374 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.778189 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.778246 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.778257 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.778272 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.778304 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.880619 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.881151 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.881243 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.881312 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.881372 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.983520 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.983589 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.983612 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.983641 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:28 crc kubenswrapper[4606]: I1212 00:24:28.983662 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:28Z","lastTransitionTime":"2025-12-12T00:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.086560 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.086635 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.086650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.086672 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.086688 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.189076 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.189119 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.189132 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.189150 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.189160 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.291797 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.291840 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.291856 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.291875 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.291890 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.394764 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.394822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.394840 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.394864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.394881 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.497931 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.497998 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.498014 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.498038 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.498055 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.601021 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.601092 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.601110 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.601130 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.601144 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.698704 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:29 crc kubenswrapper[4606]: E1212 00:24:29.699332 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.699447 4606 scope.go:117] "RemoveContainer" containerID="8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37" Dec 12 00:24:29 crc kubenswrapper[4606]: E1212 00:24:29.700045 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.714908 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.715134 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.715261 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.715307 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.715336 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.721248 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.743576 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.760875 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.778528 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.806003 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.818554 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.818603 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.818625 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.818654 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.818676 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.829553 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.859403 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 
00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.874223 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.887970 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc 
kubenswrapper[4606]: I1212 00:24:29.902435 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.914402 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.921389 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.921419 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.921429 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.921447 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.921461 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:29Z","lastTransitionTime":"2025-12-12T00:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.930752 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542
428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.946563 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.958601 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.971850 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:29 crc kubenswrapper[4606]: I1212 00:24:29.985846 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:29Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.008492 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:30Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.021265 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:30Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.023452 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.023621 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.023736 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.023878 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.024017 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.126438 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.126474 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.126486 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.126501 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.126511 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.229612 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.229661 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.229674 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.229690 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.229701 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.333976 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.334472 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.334689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.334889 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.335076 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.438739 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.438773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.438783 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.438797 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.438807 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.540547 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.540778 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.540866 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.540998 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.541122 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.643751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.643795 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.643805 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.643819 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.643828 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.699420 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.699462 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.699435 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:30 crc kubenswrapper[4606]: E1212 00:24:30.699567 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:30 crc kubenswrapper[4606]: E1212 00:24:30.699644 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:30 crc kubenswrapper[4606]: E1212 00:24:30.699747 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.747361 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.747397 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.747405 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.747421 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.747430 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.850458 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.850777 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.850879 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.850968 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.851056 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.953544 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.953622 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.953646 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.953675 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:30 crc kubenswrapper[4606]: I1212 00:24:30.953698 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:30Z","lastTransitionTime":"2025-12-12T00:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.056144 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.056263 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.056284 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.056311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.056329 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.160677 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.161073 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.161083 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.161101 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.161111 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.264215 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.264277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.264289 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.264312 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.264327 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.367006 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.367103 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.367128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.367158 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.367213 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.470007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.470043 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.470053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.470117 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.470132 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.572954 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.573036 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.573053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.573111 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.573131 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.675637 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.675686 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.675698 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.675718 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.675733 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.699544 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:31 crc kubenswrapper[4606]: E1212 00:24:31.699754 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.778459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.778511 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.778523 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.778542 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.778555 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.881619 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.881661 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.881674 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.881706 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.881720 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.984960 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.985874 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.986037 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.986256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:31 crc kubenswrapper[4606]: I1212 00:24:31.986484 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:31Z","lastTransitionTime":"2025-12-12T00:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.016026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.016084 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.016109 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.016135 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.016157 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.037276 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.042799 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.042885 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.042936 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.042964 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.042982 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.063094 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.068395 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.068461 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.068471 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.068491 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.068515 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.087948 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.091865 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.091892 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.091900 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.091916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.091927 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.102828 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.108407 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.108600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.108721 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.108864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.108992 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.127085 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:32Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.127632 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.129358 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.129391 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.129399 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.129414 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.129425 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.232320 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.232360 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.232370 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.232388 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.232401 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.335098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.335243 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.335255 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.335279 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.335291 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.437721 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.437772 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.437788 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.437812 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.437840 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.541202 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.541258 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.541271 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.541296 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.541314 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.644352 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.644417 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.644433 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.644456 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.644470 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.699276 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.699341 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.699425 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.699276 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.699494 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:32 crc kubenswrapper[4606]: E1212 00:24:32.699636 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.746503 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.746533 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.746541 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.746553 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.746581 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.849357 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.849454 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.849474 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.849497 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.849744 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.952966 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.953012 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.953024 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.953042 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:32 crc kubenswrapper[4606]: I1212 00:24:32.953055 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:32Z","lastTransitionTime":"2025-12-12T00:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.055319 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.055846 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.055929 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.056026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.056087 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.158511 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.158549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.158558 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.158573 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.158584 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.260834 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.260864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.260873 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.260886 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.260896 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.363326 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.363381 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.363391 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.363414 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.363431 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.466445 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.466762 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.466865 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.466964 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.467066 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.569825 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.569878 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.569888 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.569902 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.569911 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.672367 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.672581 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.672705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.672798 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.672886 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.698973 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:33 crc kubenswrapper[4606]: E1212 00:24:33.699425 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.780419 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.780458 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.780468 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.780488 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.780499 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.883806 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.884107 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.884286 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.884328 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.884355 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.987064 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.987102 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.987113 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.987133 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:33 crc kubenswrapper[4606]: I1212 00:24:33.987146 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:33Z","lastTransitionTime":"2025-12-12T00:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.089074 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.089109 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.089121 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.089139 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.089151 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.191086 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.191134 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.191147 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.191164 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.191199 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.292988 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.293025 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.293035 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.293052 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.293062 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.395370 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.395433 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.395450 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.395473 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.395489 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.497508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.497553 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.497570 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.497592 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.497608 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.599413 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.599488 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.599516 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.599545 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.599573 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.699538 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.699633 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.699540 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:34 crc kubenswrapper[4606]: E1212 00:24:34.699662 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:34 crc kubenswrapper[4606]: E1212 00:24:34.699804 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:34 crc kubenswrapper[4606]: E1212 00:24:34.699892 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.701795 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.701823 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.701861 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.701875 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.701885 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.804256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.804325 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.804333 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.804375 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.804387 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.909153 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.909213 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.909223 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.909238 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:34 crc kubenswrapper[4606]: I1212 00:24:34.909248 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:34Z","lastTransitionTime":"2025-12-12T00:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.011615 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.011652 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.011670 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.011686 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.011695 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.114197 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.114247 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.114261 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.114276 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.114540 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.216481 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.216518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.216528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.216542 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.216553 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.318535 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.318566 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.318574 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.318587 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.318596 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.420661 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.420781 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.420798 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.421100 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.421378 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.455391 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:35 crc kubenswrapper[4606]: E1212 00:24:35.455515 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:35 crc kubenswrapper[4606]: E1212 00:24:35.455565 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:25:07.45555088 +0000 UTC m=+98.000903746 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.523558 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.523600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.523610 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.523624 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.523634 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.626524 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.626549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.626557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.626569 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.626579 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.698902 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:35 crc kubenswrapper[4606]: E1212 00:24:35.699035 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.728524 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.728549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.728558 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.728569 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.728579 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.830368 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.830401 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.830410 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.830422 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.830430 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.933357 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.933432 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.933458 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.933489 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:35 crc kubenswrapper[4606]: I1212 00:24:35.933513 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:35Z","lastTransitionTime":"2025-12-12T00:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.036800 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.036871 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.036889 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.036962 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.036980 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.139027 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.139088 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.139099 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.139114 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.139125 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.241227 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.241264 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.241275 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.241289 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.241299 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.343360 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.343599 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.343689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.343773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.343832 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.446580 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.446872 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.447477 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.447554 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.447631 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.549712 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.549751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.549763 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.549779 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.549790 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.652405 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.652655 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.652794 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.652872 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.652936 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.698821 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.698841 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:36 crc kubenswrapper[4606]: E1212 00:24:36.699283 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:36 crc kubenswrapper[4606]: E1212 00:24:36.699308 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.698923 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:36 crc kubenswrapper[4606]: E1212 00:24:36.699571 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.754959 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.754991 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.754999 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.755011 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.755019 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.856569 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.856612 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.856625 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.856643 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.856655 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.958926 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.958973 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.958981 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.958996 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:36 crc kubenswrapper[4606]: I1212 00:24:36.959005 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:36Z","lastTransitionTime":"2025-12-12T00:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.060961 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.061256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.061340 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.061424 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.061487 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.097475 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/0.log" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.097536 4606 generic.go:334] "Generic (PLEG): container finished" podID="b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0" containerID="84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0" exitCode=1 Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.097573 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerDied","Data":"84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.097968 4606 scope.go:117] "RemoveContainer" containerID="84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.108344 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.119574 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.132157 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.142984 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.155220 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.163553 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.163587 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.163597 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.163612 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.163621 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.166336 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269aa8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.185239 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc 
kubenswrapper[4606]: I1212 00:24:37.196285 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.205445 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.217160 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.233710 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalvers
ions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.278064 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.278113 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.278126 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.278141 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.278153 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.281471 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.291738 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.302304 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.322630 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc23
19cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.335379 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.347898 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.359624 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:37Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.380118 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.380155 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.380183 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 
00:24:37.380198 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.380208 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.481907 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.482041 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.482110 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.482214 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.482297 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.584286 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.584546 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.584634 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.584731 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.584847 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.687526 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.687584 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.687603 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.687629 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.687647 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.698847 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:37 crc kubenswrapper[4606]: E1212 00:24:37.699039 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.789846 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.790056 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.790144 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.790276 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.790360 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.892123 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.892152 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.892162 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.892197 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.892208 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.993833 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.993864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.993873 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.993887 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:37 crc kubenswrapper[4606]: I1212 00:24:37.993897 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:37Z","lastTransitionTime":"2025-12-12T00:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.096322 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.096362 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.096372 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.096389 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.096401 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.101839 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/0.log" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.101940 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerStarted","Data":"d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.122272 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.136425 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.148594 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.159609 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.170265 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.184065 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.196704 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.198053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.198085 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.198098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.198127 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.198141 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.210506 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a
8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.222484 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.233885 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.243966 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.254657 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.265372 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.273690 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.285629 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.301378 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.301426 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.301443 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.301464 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.301483 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.303730 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 
00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.313871 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.324393 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:38Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:38 crc 
kubenswrapper[4606]: I1212 00:24:38.404036 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.404064 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.404072 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.404085 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.404094 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.507026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.507058 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.507066 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.507081 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.507090 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.609831 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.609879 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.609890 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.609909 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.609921 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.699611 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:38 crc kubenswrapper[4606]: E1212 00:24:38.699754 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.699611 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.699615 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:38 crc kubenswrapper[4606]: E1212 00:24:38.699939 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:38 crc kubenswrapper[4606]: E1212 00:24:38.699961 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.712186 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.712209 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.712218 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.712232 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.712241 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.815258 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.815300 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.815309 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.815323 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.815332 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.918375 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.918411 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.918422 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.918437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:38 crc kubenswrapper[4606]: I1212 00:24:38.918453 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:38Z","lastTransitionTime":"2025-12-12T00:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.021198 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.021237 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.021251 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.021266 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.021277 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.123727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.123784 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.123802 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.123825 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.123841 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.226660 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.226697 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.226710 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.226729 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.226741 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.329594 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.329635 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.329647 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.329665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.329705 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.433608 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.433649 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.433659 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.433676 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.433687 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.536638 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.536693 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.536705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.536721 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.536733 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.638484 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.638538 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.638548 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.638562 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.638572 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.698834 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:39 crc kubenswrapper[4606]: E1212 00:24:39.698962 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.716633 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.732993 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 
00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.740715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.740754 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.740764 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.740781 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.740792 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.743118 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269aa8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.752626 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc 
kubenswrapper[4606]: I1212 00:24:39.762369 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.773013 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.789492 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.801084 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.811002 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.819557 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.833049 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.843973 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.844018 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.844029 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.844045 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.844057 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.852049 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.879888 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.898403 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.925280 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.937412 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.946015 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.946048 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.946069 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.946087 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.946098 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:39Z","lastTransitionTime":"2025-12-12T00:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.948719 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a
8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:39 crc kubenswrapper[4606]: I1212 00:24:39.960130 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:39Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.047950 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.047990 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.048001 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.048015 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.048024 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.150142 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.150557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.150571 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.150590 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.150602 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.253233 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.253272 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.253285 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.253301 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.253313 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.355239 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.355275 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.355287 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.355302 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.355312 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.457396 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.457432 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.457443 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.457459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.457470 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.559248 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.559281 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.559293 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.559309 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.559320 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.661361 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.661399 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.661410 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.661434 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.661448 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.698844 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.698900 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.698936 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:40 crc kubenswrapper[4606]: E1212 00:24:40.698965 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:40 crc kubenswrapper[4606]: E1212 00:24:40.699072 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:40 crc kubenswrapper[4606]: E1212 00:24:40.699118 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.763508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.763544 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.763556 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.763592 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.763606 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.865650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.865682 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.865692 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.865706 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.865716 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.967834 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.967869 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.967881 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.967895 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:40 crc kubenswrapper[4606]: I1212 00:24:40.967905 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:40Z","lastTransitionTime":"2025-12-12T00:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.069695 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.069730 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.069742 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.069758 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.069768 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.171751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.171795 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.171811 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.171831 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.171846 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.273956 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.273997 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.274007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.274023 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.274034 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.376581 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.376618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.376630 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.376651 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.376669 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.478892 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.478922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.478930 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.479140 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.479150 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.581651 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.581680 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.581688 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.581700 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.581708 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.684451 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.684515 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.684541 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.684572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.684594 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.699286 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:41 crc kubenswrapper[4606]: E1212 00:24:41.699413 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.709244 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.787524 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.787563 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.787571 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.787588 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.787601 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.890223 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.890274 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.890285 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.890303 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.890312 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.992809 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.992842 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.992851 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.992866 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:41 crc kubenswrapper[4606]: I1212 00:24:41.992875 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:41Z","lastTransitionTime":"2025-12-12T00:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.094959 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.095018 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.095043 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.095072 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.095095 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.198479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.198513 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.198521 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.198533 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.198542 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.300879 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.300943 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.300966 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.300996 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.301017 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.354625 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.354652 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.354661 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.354672 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.354681 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.367740 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:42Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.371365 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.371398 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.371408 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.371421 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.371432 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.382948 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:42Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.386884 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.386929 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.386941 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.386959 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.386971 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.399258 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:42Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.402880 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.402987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.403062 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.403125 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.403203 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.418490 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:42Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.421933 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.422059 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.422238 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.422371 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.422433 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.435955 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:42Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.436109 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.437393 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.437412 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.437420 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.437434 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.437443 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.539326 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.539392 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.539408 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.539428 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.539441 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.642119 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.642195 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.642211 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.642233 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.642249 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.698616 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.698649 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.698706 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.698757 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.698906 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:42 crc kubenswrapper[4606]: E1212 00:24:42.698939 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.699527 4606 scope.go:117] "RemoveContainer" containerID="8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.745092 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.745143 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.745154 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.745191 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.745205 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.848650 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.848694 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.848706 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.848721 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.848732 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.951058 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.951119 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.951141 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.951165 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:42 crc kubenswrapper[4606]: I1212 00:24:42.951225 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:42Z","lastTransitionTime":"2025-12-12T00:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.053819 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.053856 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.053865 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.053878 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.053888 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.156427 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.156487 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.156505 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.156530 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.156548 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.259401 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.259438 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.259452 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.259473 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.259489 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.362187 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.362223 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.362234 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.362251 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.362264 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.465128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.465189 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.465198 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.465216 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.465225 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.567710 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.567742 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.567754 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.567769 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.567796 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.670345 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.670372 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.670381 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.670393 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.670401 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.699250 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:43 crc kubenswrapper[4606]: E1212 00:24:43.699394 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.777755 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.777796 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.777806 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.777821 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.777830 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.880644 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.880679 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.880689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.880705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.880716 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.983157 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.983227 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.983239 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.983263 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:43 crc kubenswrapper[4606]: I1212 00:24:43.983275 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:43Z","lastTransitionTime":"2025-12-12T00:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.085346 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.085377 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.085425 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.085439 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.085448 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.120239 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/2.log" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.122518 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.123108 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.136344 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67cd27ea-8882-4aa1-ba7a-eb2058cac536\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c06b3470d2a194c99cd63ef6039a6f2abff89fc93e8f61feca67c4eefa9b9ac8\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.148276 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.161845 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.176142 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.186439 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.188048 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.188098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.188110 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.188132 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.188143 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.198444 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.208742 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.216559 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.228752 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.244197 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalvers
ions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.255114 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.263307 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc 
kubenswrapper[4606]: I1212 00:24:44.279422 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.289617 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.289668 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.289680 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.289699 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.289709 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.291699 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767
ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.302529 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.312372 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.321603 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.330232 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.340379 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:44Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.392084 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.392128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.392139 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.392155 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.392169 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.495806 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.495988 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.496235 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.496267 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.496289 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.598572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.598600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.598607 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.598620 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.598628 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701197 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:44 crc kubenswrapper[4606]: E1212 00:24:44.701320 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701197 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:44 crc kubenswrapper[4606]: E1212 00:24:44.701415 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701548 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701617 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701638 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701655 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.701870 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:44 crc kubenswrapper[4606]: E1212 00:24:44.701996 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.804224 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.804287 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.804308 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.804337 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.804359 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.907261 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.907307 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.907316 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.907333 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:44 crc kubenswrapper[4606]: I1212 00:24:44.907343 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:44Z","lastTransitionTime":"2025-12-12T00:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.011387 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.011467 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.011485 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.011504 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.011520 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.113913 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.113973 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.113992 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.114017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.114036 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.126974 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/3.log" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.127793 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/2.log" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.131650 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" exitCode=1 Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.131704 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.131750 4606 scope.go:117] "RemoveContainer" containerID="8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.133407 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:24:45 crc kubenswrapper[4606]: E1212 00:24:45.133781 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.149592 4606 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.160821 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269aa8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 
00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.170425 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc 
kubenswrapper[4606]: I1212 00:24:45.181235 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.191184 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.209015 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.217098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.217130 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.217141 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.217157 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.217167 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.239150 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc7d9363ed2e08b897cfc835f60d45070a0a21643ac00f147f7e0108af46f37\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:16Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1212 00:24:15.754838 6204 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755303 6204 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1212 
00:24:15.755326 6204 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1212 00:24:15.755332 6204 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1212 00:24:15.755347 6204 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1212 00:24:15.755357 6204 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1212 00:24:15.755450 6204 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:15.755585 6204 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1212 00:24:15.755619 6204 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1212 00:24:15.755649 6204 factory.go:656] Stopping watch factory\\\\nI1212 00:24:15.755674 6204 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:15.755690 6204 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:44Z\\\",\\\"message\\\":\\\"atch factory\\\\nI1212 00:24:44.033859 6579 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1212 00:24:44.033876 6579 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1212 00:24:44.033886 6579 handler.go:208] Removed *v1.Node event handler 2\\\\nI1212 00:24:44.033899 6579 handler.go:208] Removed *v1.Node event handler 7\\\\nI1212 00:24:44.033905 6579 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1212 
00:24:44.033909 6579 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:24:44.033926 6579 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:24:44.033935 6579 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:44.034088 6579 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034244 6579 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034528 6579 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034862 6579 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var
/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.250366 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.263518 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.275026 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.292103 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.306426 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.316398 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.318872 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.318909 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.318922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.318939 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.318951 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.326654 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.336276 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.347335 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67cd27ea-8882-4aa1-ba7a-eb2058cac536\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c06b3470d2a194c99cd63ef6039a6f2abff89fc93e8f61feca67c4eefa9b9ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc 
kubenswrapper[4606]: I1212 00:24:45.359018 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.375006 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.387933 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:45Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.422421 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 
00:24:45.422472 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.422489 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.422512 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.422528 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.525058 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.525124 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.525220 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.525248 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.525267 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.628672 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.628738 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.628761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.628787 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.628804 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.699358 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:45 crc kubenswrapper[4606]: E1212 00:24:45.699566 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.731504 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.731551 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.731565 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.731584 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.731597 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.834150 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.834195 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.834205 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.834221 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.834231 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.937539 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.937598 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.937619 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.937648 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:45 crc kubenswrapper[4606]: I1212 00:24:45.937673 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:45Z","lastTransitionTime":"2025-12-12T00:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.040264 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.040326 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.040343 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.040374 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.040390 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.137814 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/3.log" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.141480 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:24:46 crc kubenswrapper[4606]: E1212 00:24:46.141619 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.141992 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.142035 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.142053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.142074 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.142090 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.155484 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.183227 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:44Z\\\",\\\"message\\\":\\\"atch factory\\\\nI1212 00:24:44.033859 6579 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1212 00:24:44.033876 6579 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1212 00:24:44.033886 6579 handler.go:208] Removed *v1.Node event handler 2\\\\nI1212 00:24:44.033899 6579 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1212 00:24:44.033905 6579 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1212 00:24:44.033909 6579 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:24:44.033926 6579 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:24:44.033935 6579 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:44.034088 6579 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034244 6579 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034528 6579 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034862 6579 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.199639 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.214212 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc 
kubenswrapper[4606]: I1212 00:24:46.232395 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.244272 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.244322 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.244336 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.244356 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.244371 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.246387 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.262540 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78
e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.276294 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.288727 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.301135 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.312837 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.343772 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.347521 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.347572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 
00:24:46.347589 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.347610 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.347633 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.368713 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-ce
rt-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.385385 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.399697 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.410585 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.419276 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67cd27ea-8882-4aa1-ba7a-eb2058cac536\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c06b3470d2a194c99cd63ef6039a6f2abff89fc93e8f61feca67c4eefa9b9ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc 
kubenswrapper[4606]: I1212 00:24:46.431678 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.443459 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:46Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.449974 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.450003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.450012 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.450024 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.450034 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.552917 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.552970 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.552986 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.553010 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.553029 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.655079 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.655140 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.655157 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.655225 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.655243 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.698608 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.698607 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:46 crc kubenswrapper[4606]: E1212 00:24:46.698730 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:46 crc kubenswrapper[4606]: E1212 00:24:46.698800 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.698630 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:46 crc kubenswrapper[4606]: E1212 00:24:46.698904 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.757290 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.757360 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.757373 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.757392 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.757405 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.859690 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.859751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.859767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.859792 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.859810 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.962635 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.962715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.962739 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.962767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:46 crc kubenswrapper[4606]: I1212 00:24:46.962787 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:46Z","lastTransitionTime":"2025-12-12T00:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.065509 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.065547 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.065557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.065571 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.065580 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.168412 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.168447 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.168459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.168474 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.168486 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.271860 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.271899 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.271911 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.271928 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.271949 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.374224 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.374273 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.374288 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.374306 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.374319 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.476518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.476553 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.476562 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.476576 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.476586 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.579265 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.579300 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.579309 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.579324 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:47 crc kubenswrapper[4606]: I1212 00:24:47.579335 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.681310 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.681346 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.681356 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.681372 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.681383 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.699206 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:48 crc kubenswrapper[4606]: E1212 00:24:47.699336 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.783668 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.783696 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.783707 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.783726 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.783738 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.886168 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.886236 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.886247 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.886263 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.886274 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.988230 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.988253 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.988261 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.988273 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:47.988281 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:47Z","lastTransitionTime":"2025-12-12T00:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.090118 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.090152 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.090162 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.090199 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.090211 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.193034 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.193096 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.193118 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.193143 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.193163 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.296080 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.296129 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.296147 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.296166 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.296264 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.398128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.398163 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.398200 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.398220 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.398233 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.500785 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.500829 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.500841 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.500857 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.500868 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.603775 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.603830 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.603845 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.603868 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.603886 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.699048 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.699078 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.699079 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:48 crc kubenswrapper[4606]: E1212 00:24:48.699195 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:48 crc kubenswrapper[4606]: E1212 00:24:48.699397 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:48 crc kubenswrapper[4606]: E1212 00:24:48.699488 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.706616 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.706752 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.706762 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.706776 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.706784 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.809759 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.809816 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.809831 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.809853 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.809868 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.913655 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.913727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.913748 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.913775 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:48 crc kubenswrapper[4606]: I1212 00:24:48.913795 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:48Z","lastTransitionTime":"2025-12-12T00:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.016887 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.016966 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.016987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.017021 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.017057 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.125762 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.125810 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.125822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.125840 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.125852 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.228323 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.228367 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.228378 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.228394 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.228406 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.330981 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.331017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.331028 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.331044 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.331056 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.434167 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.434229 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.434244 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.434262 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.434275 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.537114 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.537190 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.537204 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.537221 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.537235 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.639906 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.639982 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.639996 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.640019 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.640036 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.699582 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:49 crc kubenswrapper[4606]: E1212 00:24:49.699769 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.712408 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.722164 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.734970 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f4
09c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.742953 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.742991 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.743009 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.743031 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.743047 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.753573 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:44Z\\\",\\\"message\\\":\\\"atch factory\\\\nI1212 00:24:44.033859 6579 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1212 00:24:44.033876 6579 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1212 00:24:44.033886 6579 handler.go:208] Removed *v1.Node event handler 2\\\\nI1212 00:24:44.033899 6579 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1212 00:24:44.033905 6579 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1212 00:24:44.033909 6579 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:24:44.033926 6579 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:24:44.033935 6579 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:44.034088 6579 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034244 6579 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034528 6579 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034862 6579 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.770597 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.782604 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc 
kubenswrapper[4606]: I1212 00:24:49.799332 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.816929 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.834416 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.846391 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.846453 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.846463 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.846485 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.846496 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.849636 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.865586 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.880783 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.897747 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.917768 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.931052 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.945227 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.948430 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.948457 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.948466 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.948478 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.948488 4606 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:49Z","lastTransitionTime":"2025-12-12T00:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.960389 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.970988 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:49 crc kubenswrapper[4606]: I1212 00:24:49.980373 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67cd27ea-8882-4aa1-ba7a-eb2058cac536\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c06b3470d2a194c99cd63ef6039a6f2abff89fc93e8f61feca67c4eefa9b9ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:49Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:50 crc 
kubenswrapper[4606]: I1212 00:24:50.050522 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.050554 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.050572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.050589 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.050602 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.152634 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.152954 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.153340 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.153534 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.153711 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.256883 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.256918 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.256929 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.256943 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.256954 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.360613 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.360646 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.360657 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.360671 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.360680 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.463234 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.463289 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.463313 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.463344 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.463369 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.565542 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.565581 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.565594 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.565616 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.565626 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.667376 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.667414 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.667424 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.667438 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.667457 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.698876 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.698913 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.698974 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:50 crc kubenswrapper[4606]: E1212 00:24:50.699018 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:50 crc kubenswrapper[4606]: E1212 00:24:50.699205 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:50 crc kubenswrapper[4606]: E1212 00:24:50.699261 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.769822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.770248 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.770492 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.770717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.770912 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.873868 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.873914 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.873928 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.873947 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.873960 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.976228 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.976278 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.976290 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.976308 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:50 crc kubenswrapper[4606]: I1212 00:24:50.976321 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:50Z","lastTransitionTime":"2025-12-12T00:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.078506 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.078539 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.078563 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.078583 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.078597 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.181717 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.181760 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.181778 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.181801 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.181819 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.285751 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.285807 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.285825 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.285850 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.285868 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.388851 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.388933 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.388959 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.388986 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.389004 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.492704 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.492767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.492788 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.492816 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.492838 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.596461 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.596506 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.596517 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.596534 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.596545 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.699123 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:51 crc kubenswrapper[4606]: E1212 00:24:51.699414 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.699764 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.699861 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.699880 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.699907 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.699927 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.803275 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.803602 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.803627 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.803660 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.803681 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.914950 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.915017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.915038 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.915067 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:51 crc kubenswrapper[4606]: I1212 00:24:51.915101 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:51Z","lastTransitionTime":"2025-12-12T00:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.018233 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.018268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.018275 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.018287 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.018296 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.121115 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.121203 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.121218 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.121244 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.121258 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.223572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.223623 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.223637 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.223653 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.223664 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.326283 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.326323 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.326332 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.326346 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.326355 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.428976 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.429011 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.429041 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.429058 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.429069 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.459936 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.459960 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.459987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.460019 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.460028 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.471275 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.475605 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.475640 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.475649 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.475662 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.475672 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.489277 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.497825 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.497860 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.497868 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.497882 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.497891 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.512381 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.516107 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.516131 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.516140 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.516153 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.516161 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.526748 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.529490 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.529507 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.529515 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.529526 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.529535 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.539483 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:52Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.539616 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.541268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.541299 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.541308 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.541320 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.541331 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.644259 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.644294 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.644304 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.644318 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.644329 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.699590 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.699798 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.699627 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.699923 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.699590 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.700020 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.737430 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.737633 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.737597802 +0000 UTC m=+147.282950708 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.747398 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.747437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.747446 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.747462 4606 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.747472 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.839166 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.839306 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.839419 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839429 4606 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839559 4606 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.839577 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839608 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.839592046 +0000 UTC m=+147.384944912 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839516 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839729 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839761 4606 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839824 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839859 4606 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839879 4606 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839890 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.839635887 +0000 UTC m=+147.384988813 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839927 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.839908295 +0000 UTC m=+147.385261271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 12 00:24:52 crc kubenswrapper[4606]: E1212 00:24:52.839956 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.839940716 +0000 UTC m=+147.385293762 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.850557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.850608 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.850625 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.850649 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.850666 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.953592 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.953662 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.953689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.953719 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:52 crc kubenswrapper[4606]: I1212 00:24:52.953823 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:52Z","lastTransitionTime":"2025-12-12T00:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.056235 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.056268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.056280 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.056296 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.056306 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.160496 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.160542 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.160559 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.160578 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.160591 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.263447 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.263500 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.263511 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.263528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.263562 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.366612 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.366656 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.366667 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.366683 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.366694 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.469878 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.469937 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.469954 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.469977 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.469994 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.572914 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.572975 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.572994 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.573017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.573034 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.676768 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.676867 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.676893 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.676924 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.676945 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.699627 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd"
Dec 12 00:24:53 crc kubenswrapper[4606]: E1212 00:24:53.699893 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.780588 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.780665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.780687 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.780724 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.780748 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.884128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.884237 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.884260 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.884289 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.884312 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.987348 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.987425 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.987449 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.987476 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:53 crc kubenswrapper[4606]: I1212 00:24:53.987495 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:53Z","lastTransitionTime":"2025-12-12T00:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.090622 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.090662 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.090673 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.090700 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.090722 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.193933 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.194005 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.194029 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.194056 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.194072 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.297091 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.297150 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.297205 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.297233 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.297251 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.401043 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.401110 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.401128 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.401152 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.401207 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.503898 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.504054 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.504074 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.504099 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.504116 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.607011 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.607058 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.607094 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.607111 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.607122 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.699259 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 12 00:24:54 crc kubenswrapper[4606]: E1212 00:24:54.699401 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.699502 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.699535 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 12 00:24:54 crc kubenswrapper[4606]: E1212 00:24:54.699698 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 12 00:24:54 crc kubenswrapper[4606]: E1212 00:24:54.699851 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.713644 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.713808 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.713916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.714003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.714086 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.817565 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.817601 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.817612 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.817629 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.817642 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.920583 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.920617 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.920629 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.920645 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:54 crc kubenswrapper[4606]: I1212 00:24:54.920656 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:54Z","lastTransitionTime":"2025-12-12T00:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.023352 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.023390 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.023398 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.023413 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.023422 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.126800 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.126936 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.126973 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.127065 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.127132 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.229572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.229627 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.229640 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.229657 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.229670 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.333081 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.333140 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.333157 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.333220 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.333238 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.435927 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.435975 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.435984 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.435999 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.436008 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.539733 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.539804 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.539822 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.539846 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.539866 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.643122 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.643163 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.643196 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.643212 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.643223 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.699574 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:55 crc kubenswrapper[4606]: E1212 00:24:55.700610 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.746391 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.746442 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.746457 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.746475 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.746488 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.849569 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.849629 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.849651 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.849681 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.849700 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.952360 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.952412 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.952429 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.952452 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:55 crc kubenswrapper[4606]: I1212 00:24:55.952470 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:55Z","lastTransitionTime":"2025-12-12T00:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.055475 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.055538 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.055555 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.055582 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.055603 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.159103 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.159158 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.159200 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.159224 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.159241 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.261853 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.261893 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.261902 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.261919 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.261929 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.364159 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.364225 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.364240 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.364257 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.364269 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.467526 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.467605 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.467631 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.467665 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.467689 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.570925 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.571034 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.571055 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.571115 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.571137 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.674268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.674331 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.674347 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.674372 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.674390 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.698937 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.699040 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.699039 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:56 crc kubenswrapper[4606]: E1212 00:24:56.699147 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:56 crc kubenswrapper[4606]: E1212 00:24:56.699336 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:56 crc kubenswrapper[4606]: E1212 00:24:56.699464 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.777378 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.777474 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.777499 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.777533 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.777558 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.881437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.881534 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.881557 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.881636 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.881704 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.984445 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.984500 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.984534 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.984561 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:56 crc kubenswrapper[4606]: I1212 00:24:56.984582 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:56Z","lastTransitionTime":"2025-12-12T00:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.087838 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.087883 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.087902 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.087920 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.087932 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.191091 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.191224 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.191249 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.191278 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.191300 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.294131 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.294218 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.294245 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.294273 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.294294 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.397528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.397644 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.397672 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.397736 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.397755 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.501412 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.501508 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.501528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.501551 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.501571 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.603752 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.603815 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.603831 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.603851 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.603866 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.698849 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:57 crc kubenswrapper[4606]: E1212 00:24:57.699100 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.706830 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.706882 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.706900 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.706922 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.706940 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.809842 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.809991 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.810019 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.810044 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.810064 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.913911 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.913987 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.914007 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.914033 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:57 crc kubenswrapper[4606]: I1212 00:24:57.914051 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:57Z","lastTransitionTime":"2025-12-12T00:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.019044 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.019111 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.019125 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.019148 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.019163 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.123223 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.123284 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.123300 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.123325 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.123338 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.226460 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.226506 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.226518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.226538 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.226553 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.329304 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.329354 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.329364 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.329386 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.329399 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.432865 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.432925 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.432940 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.432964 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.432978 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.535917 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.535959 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.535971 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.535991 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.536003 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.638130 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.638186 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.638195 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.638213 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.638225 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.699544 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.699673 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.699673 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:24:58 crc kubenswrapper[4606]: E1212 00:24:58.699851 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:24:58 crc kubenswrapper[4606]: E1212 00:24:58.700164 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:24:58 crc kubenswrapper[4606]: E1212 00:24:58.700321 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.740626 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.740689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.740703 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.740722 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.740735 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.843707 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.843801 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.843812 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.843843 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.843862 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.947236 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.947329 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.947359 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.947390 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:58 crc kubenswrapper[4606]: I1212 00:24:58.947417 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:58Z","lastTransitionTime":"2025-12-12T00:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.050680 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.050752 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.050791 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.050823 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.050845 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.154017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.154098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.154121 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.154150 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.154169 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.257123 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.257218 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.257236 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.257256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.257269 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.359305 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.359380 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.359397 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.359882 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.359951 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.464365 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.464428 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.464447 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.464473 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.464492 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.567923 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.567999 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.568024 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.568062 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.568087 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.670042 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.670084 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.670094 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.670111 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.670123 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.699995 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:24:59 crc kubenswrapper[4606]: E1212 00:24:59.700371 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.700473 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:24:59 crc kubenswrapper[4606]: E1212 00:24:59.700949 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.713109 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.727876 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f4
09c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.752614 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:44Z\\\",\\\"message\\\":\\\"atch factory\\\\nI1212 00:24:44.033859 6579 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1212 00:24:44.033876 6579 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1212 00:24:44.033886 6579 handler.go:208] Removed *v1.Node event handler 2\\\\nI1212 00:24:44.033899 6579 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1212 00:24:44.033905 6579 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1212 00:24:44.033909 6579 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:24:44.033926 6579 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:24:44.033935 6579 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:44.034088 6579 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034244 6579 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034528 6579 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034862 6579 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.769308 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.773065 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.773113 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.773127 4606 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.773195 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.773208 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.782907 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 
12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.828256 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.848824 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.865761 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.876194 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.876250 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.876265 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.876289 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.876304 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.881097 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.895601 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.911507 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.928461 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.954445 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.966921 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.978449 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.978501 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.978513 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.978528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.978539 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:24:59Z","lastTransitionTime":"2025-12-12T00:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:24:59 crc kubenswrapper[4606]: I1212 00:24:59.982952 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.001002 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:24:59Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.014344 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.028016 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67cd27ea-8882-4aa1-ba7a-eb2058cac536\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c06b3470d2a194c99cd63ef6039a6f2abff89fc93e8f61feca67c4eefa9b9ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:00 crc 
kubenswrapper[4606]: I1212 00:25:00.041908 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:00Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.080933 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.080970 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.080985 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.081002 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.081013 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.183105 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.183141 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.183151 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.183166 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.183194 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.285450 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.285486 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.285498 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.285511 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.285525 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.389095 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.389205 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.389238 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.389270 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.389299 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.492310 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.492362 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.492380 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.492404 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.492422 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.595114 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.595219 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.595256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.595285 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.595307 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.697788 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.697827 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.697842 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.697862 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.697876 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.699242 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.699283 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:00 crc kubenswrapper[4606]: E1212 00:25:00.699327 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.699245 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:00 crc kubenswrapper[4606]: E1212 00:25:00.699383 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:00 crc kubenswrapper[4606]: E1212 00:25:00.699686 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.801312 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.801355 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.801372 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.801393 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.801409 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.904275 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.904328 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.904341 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.904365 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:00 crc kubenswrapper[4606]: I1212 00:25:00.904379 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:00Z","lastTransitionTime":"2025-12-12T00:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.007964 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.008012 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.008022 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.008041 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.008056 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.111143 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.111232 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.111249 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.111277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.111295 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.214206 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.214244 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.214255 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.214270 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.214280 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.316610 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.316880 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.316963 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.317037 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.317099 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.419818 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.419867 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.419877 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.419890 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.419900 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.522335 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.522369 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.522380 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.522394 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.522404 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.625967 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.626010 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.626023 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.626039 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.626051 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.698776 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:01 crc kubenswrapper[4606]: E1212 00:25:01.698944 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.728715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.728775 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.728801 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.728832 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.728855 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.831807 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.831889 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.831981 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.832013 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.832037 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.934683 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.934720 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.934729 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.934744 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:01 crc kubenswrapper[4606]: I1212 00:25:01.934756 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:01Z","lastTransitionTime":"2025-12-12T00:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.037857 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.037983 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.038064 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.038144 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.038211 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.142081 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.142700 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.142814 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.142926 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.143029 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.246753 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.247056 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.247166 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.247298 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.247393 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.349487 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.349759 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.349954 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.350123 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.350316 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.452747 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.453340 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.453379 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.453406 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.453425 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.554211 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.554234 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.554242 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.554255 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.554264 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.575064 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.581479 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.581525 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.581540 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.581559 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.581576 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.594725 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.600123 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.600477 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.600613 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.600705 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.600800 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.619056 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.623284 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.623673 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.623780 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.623891 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.623994 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.639111 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.643362 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.643600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.643752 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.643906 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.644047 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.658578 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-12T00:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"19ab28be-ccfb-4859-88d0-8d375e2d06f2\\\",\\\"systemUUID\\\":\\\"d5982033-b2dc-474c-9cda-275cd567c208\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:02Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.659235 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.661257 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.661482 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.661686 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.661900 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.662131 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.698992 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.699155 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.699350 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.699409 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.699697 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:02 crc kubenswrapper[4606]: E1212 00:25:02.699582 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.765916 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.765995 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.766019 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.766095 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.766124 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.869217 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.869259 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.869268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.869283 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.869295 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.979715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.979782 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.979800 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.979823 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:02 crc kubenswrapper[4606]: I1212 00:25:02.979840 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:02Z","lastTransitionTime":"2025-12-12T00:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.082010 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.082057 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.082068 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.082085 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.082098 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.184539 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.184588 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.184600 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.184618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.184630 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.286728 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.286757 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.286767 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.286782 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.286792 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.389268 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.389343 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.389356 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.389375 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.389388 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.491408 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.491476 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.491487 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.491500 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.491510 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.593534 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.593594 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.593605 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.593622 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.593634 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.697488 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.697554 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.697572 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.697598 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.697617 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.698801 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:03 crc kubenswrapper[4606]: E1212 00:25:03.699008 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.800657 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.800715 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.800728 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.800937 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.800950 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.903405 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.903450 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.903462 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.903481 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:03 crc kubenswrapper[4606]: I1212 00:25:03.903494 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:03Z","lastTransitionTime":"2025-12-12T00:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.005516 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.005541 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.005549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.005561 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.005569 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.107921 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.107963 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.107974 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.107991 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.108001 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.210443 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.210492 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.210504 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.210520 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.210531 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.312301 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.312343 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.312356 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.312371 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.312382 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.414506 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.414550 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.414565 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.414586 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.414603 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.517263 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.517291 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.517298 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.517311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.517319 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.620901 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.621024 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.621096 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.621125 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.621241 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.699573 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.699608 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:04 crc kubenswrapper[4606]: E1212 00:25:04.699791 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.699631 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:04 crc kubenswrapper[4606]: E1212 00:25:04.699943 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:04 crc kubenswrapper[4606]: E1212 00:25:04.700029 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.724482 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.724528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.724544 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.724567 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.724584 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.827459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.827524 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.827549 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.827582 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.827604 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.931241 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.931304 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.931322 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.931343 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:04 crc kubenswrapper[4606]: I1212 00:25:04.931359 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:04Z","lastTransitionTime":"2025-12-12T00:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.033852 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.034262 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.034282 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.034304 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.034321 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.137165 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.137221 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.137232 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.137249 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.137263 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.240087 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.240130 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.240145 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.240166 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.240227 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.342385 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.342431 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.342447 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.342469 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.342484 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.445942 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.446006 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.446029 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.446053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.446072 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.549303 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.549368 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.549387 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.549412 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.549430 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.651360 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.651389 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.651398 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.651411 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.651420 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.699295 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:05 crc kubenswrapper[4606]: E1212 00:25:05.699471 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.754552 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.754601 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.754617 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.754639 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.754656 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.857958 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.858033 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.858051 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.858076 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.858092 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.961057 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.961124 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.961149 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.961219 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:05 crc kubenswrapper[4606]: I1212 00:25:05.961245 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:05Z","lastTransitionTime":"2025-12-12T00:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.064267 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.064575 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.064590 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.064607 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.064619 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.167728 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.167762 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.167774 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.167792 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.167804 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.270117 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.270203 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.270225 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.270248 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.270266 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.373512 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.373763 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.373847 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.373935 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.374011 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.477760 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.477845 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.477867 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.477898 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.477919 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.580787 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.580846 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.580864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.580893 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.580910 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.682953 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.682998 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.683009 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.683026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.683040 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.699242 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.699258 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:06 crc kubenswrapper[4606]: E1212 00:25:06.699331 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.699469 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:06 crc kubenswrapper[4606]: E1212 00:25:06.699515 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:06 crc kubenswrapper[4606]: E1212 00:25:06.699619 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.786097 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.786156 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.786201 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.786226 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.786245 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.888954 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.888982 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.888990 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.889003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.889011 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.991924 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.991995 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.992006 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.992021 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:06 crc kubenswrapper[4606]: I1212 00:25:06.992029 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:06Z","lastTransitionTime":"2025-12-12T00:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.095224 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.096004 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.096156 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.096418 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.096565 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.198880 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.198913 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.198921 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.198934 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.198943 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.302061 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.302130 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.302154 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.302219 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.302245 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.405108 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.405267 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.405294 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.405321 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.405341 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.505113 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:07 crc kubenswrapper[4606]: E1212 00:25:07.505311 4606 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:25:07 crc kubenswrapper[4606]: E1212 00:25:07.505380 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs podName:0853dce1-c009-407e-960d-1113f85e503f nodeName:}" failed. No retries permitted until 2025-12-12 00:26:11.505358555 +0000 UTC m=+162.050711451 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs") pod "network-metrics-daemon-mjjwd" (UID: "0853dce1-c009-407e-960d-1113f85e503f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.508098 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.508156 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.508214 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.508245 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.508313 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.610996 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.611038 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.611048 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.611063 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.611073 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.699377 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:07 crc kubenswrapper[4606]: E1212 00:25:07.699564 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.713611 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.713668 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.713686 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.713721 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.713758 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.817454 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.817486 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.817494 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.817507 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.817516 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.919667 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.919765 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.919785 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.919817 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:07 crc kubenswrapper[4606]: I1212 00:25:07.919834 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:07Z","lastTransitionTime":"2025-12-12T00:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.023356 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.023411 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.023437 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.023458 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.023473 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.125321 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.125365 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.125377 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.125394 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.125756 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.227720 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.227756 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.227764 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.227777 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.227786 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.331017 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.331077 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.331102 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.331131 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.331151 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.434582 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.434649 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.434674 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.434702 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.434725 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.537030 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.537116 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.537135 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.537157 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.537215 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.640381 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.640438 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.640447 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.640469 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.640480 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.698673 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.698833 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.699152 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:08 crc kubenswrapper[4606]: E1212 00:25:08.699119 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:08 crc kubenswrapper[4606]: E1212 00:25:08.699463 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:08 crc kubenswrapper[4606]: E1212 00:25:08.699645 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.743923 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.744005 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.744026 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.744053 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.744074 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.848368 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.848428 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.848445 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.848469 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.848489 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.951052 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.951123 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.951144 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.951211 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:08 crc kubenswrapper[4606]: I1212 00:25:08.951236 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:08Z","lastTransitionTime":"2025-12-12T00:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.054486 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.054565 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.054589 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.054625 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.054650 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.157880 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.157924 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.157935 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.157957 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.157975 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.260006 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.260051 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.260066 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.260085 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.260099 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.362773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.362806 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.362823 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.362844 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.362859 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.465311 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.465370 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.465388 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.465410 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.465428 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.568459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.568506 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.568518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.568536 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.568548 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.672080 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.672258 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.672277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.672303 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.672321 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.699667 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:09 crc kubenswrapper[4606]: E1212 00:25:09.699817 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.714776 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.727413 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://694cd2f423e8241bd8fd68395824d1cd45a12d80e92ad57e63e59c0642b21384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.737114 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a543e227-be89-40cb-941d-b4707cc28921\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b09638dd0d881593745c14548156f20a9366f65ecfa5018d510b902240dd4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-986w2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cqmz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.749841 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xzcfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:36Z\\\",\\\"message\\\":\\\"2025-12-12T00:23:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c\\\\n2025-12-12T00:23:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d712852f-61b9-48f1-a1d6-1031b964219c to /host/opt/cni/bin/\\\\n2025-12-12T00:23:51Z [verbose] multus-daemon started\\\\n2025-12-12T00:23:51Z [verbose] 
Readiness Indicator file check\\\\n2025-12-12T00:24:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spsv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xzcfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.771712 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8df398-6835-482a-a2cd-3395b6e9efab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b993d03e36e69edb510ce98f50dfb98344fb6fe2b5debeff3a2004807dd00ee6\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6fef20d4e159a405f0ab0206c49e5e314e97bf7ddaf0758e090106945c9ffe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accf40ef1339b65136666c1ea6f21a70d2152a650c87ef3ade19ddb6f9effbc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd49ad6d6011a235d440da0e8058c0fc4dcc64effec37b50a68e84438d537a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2860bce1479ca1d558c4f5230d76cadcaf3efc0e3842ef0d371ecf1168cd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7780eb068dc4a157794f9253367d2198a38a2c039cf3906fc2319cf48db3cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://684c387c89889a0a1e8d317ca70ebfc71ea377eb5de9c08c71a3ff02faf99e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78da314feaec6cab81e27eaea086d2bba3a05c4b73ce2add1a0bb57a226b0f5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.774653 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.774689 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 
00:25:09.774698 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.774711 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.774720 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.784138 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92839d27-9755-4aaa-a4c2-8aa83b6473de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-ce
rt-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765499023\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765499022\\\\\\\\\\\\\\\" (2025-12-11 23:23:42 +0000 UTC to 2026-12-11 23:23:42 +0000 UTC (now=2025-12-12 00:23:48.120831298 +0000 UTC))\\\\\\\"\\\\nI1212 00:23:48.120889 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1212 00:23:48.120921 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1212 00:23:48.120964 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1212 00:23:48.120986 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1212 00:23:48.121028 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3271453193/tls.crt::/tmp/serving-cert-3271453193/tls.key\\\\\\\"\\\\nI1212 00:23:48.121084 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1212 00:23:48.121211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121228 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1212 00:23:48.121265 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1212 00:23:48.121275 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1212 00:23:48.121345 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1212 00:23:48.121358 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1212 00:23:48.123292 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.795538 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ba650bd-b1e1-4950-98c6-2bee87181b18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de6cd366dc8d770945235faa6560cea35c0ef7eabd95cb272639e2116e2d623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a9d2fb0d7dcdec6339fac981ee0539911f1998a86ecf02a01015d3b1b5907\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4309a0a93977264e5000a930a32a4fb18ebe51fcf2af20bbf153f30f09759ff6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.809575 4606 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.818083 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f437a237-3eb8-4817-b0a9-35efece69933\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f587a144b02f6ba1229380174f2a4f2c50ed33702fe8d64eb5ff7174b32eb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qq49f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.828360 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67cd27ea-8882-4aa1-ba7a-eb2058cac536\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c06b3470d2a194c99cd63ef6039a6f2abff89fc93e8f61feca67c4eefa9b9ac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://892ac723ca54cc612171eec750846b774fcfc54e09531ff344813a1e1afc61fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc 
kubenswrapper[4606]: I1212 00:25:09.840358 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab571c5f-6f1d-4cb9-9a00-a57cc7baad63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7a27c1b503f68ce99c9004d0190e5c74380bf2ff33b2b0e1e7f424e5cf9d450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a159e87c39859bf3b5652b40223e4a8fdd9dcae3d23c1fae17d0eb8b5842a71c\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338552ee9504d46234c3caf3d9b7306a033258a94b6cf7c542bb957ea32a94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c88faa76d30856d9170f04a153de4239eefbef49b2ee25a5bd04502697248b5c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.854077 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4986a294a19f85082b9caf854f13f9c3587ca49314b40917b00768ee0bb51ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4251eab73ccb6141903993f8df922037762f7c11dc1e0ef1a2b5708ed435307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.867237 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce6c4e852b4c68e910621ad11058beecb40960d501d3e64f5fcad97c442df4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.876744 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.876788 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.876799 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.876815 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.876828 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.889749 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da25b0ba-5398-4185-a4c1-aeba44ae5633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-12T00:24:44Z\\\",\\\"message\\\":\\\"atch factory\\\\nI1212 00:24:44.033859 6579 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1212 00:24:44.033876 6579 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1212 00:24:44.033886 6579 handler.go:208] Removed *v1.Node event handler 2\\\\nI1212 00:24:44.033899 6579 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1212 00:24:44.033905 6579 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1212 00:24:44.033909 6579 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1212 00:24:44.033926 6579 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1212 00:24:44.033935 6579 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1212 00:24:44.034088 6579 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034244 6579 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034528 6579 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1212 00:24:44.034862 6579 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-12T00:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f94a9f641815d76464
c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbcl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hpw5w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.899519 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c831d6d-b07d-46dd-adc0-85239379350f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba170e097af91d1b27e0ee3e414e0831a5a93256c693247e7129a5fda0b9dc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66f7e90d4969cb2791da434d47d0e4fb269a
a8760347c7d59bd8d16f4cf42c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hbjbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7rxps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.907484 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0853dce1-c009-407e-960d-1113f85e503f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:24:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwwtt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:24:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mjjwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc 
kubenswrapper[4606]: I1212 00:25:09.917122 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.925987 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-554rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d34036b-3243-416a-a0c0-1d1ddb3a0ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://346417326a8a4ba8e839ce4870dbaacd33a90b4ab096d8ce573ce598d909b51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9k8v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-554rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.938086 4606 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"470c2076-46bf-4305-9fb1-3e509eb4d4f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-12T00:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e7b08154aa4792439cd9e13c38f1c3a106a4d138d8e6a47d5406c82f11662c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-12T00:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34a40d0ce85078630553b90b7a35510e0d3523f600db8940d5e0cf063903192d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f0d903a3e09486a00bc5031c1a100d7a520f8e0cbdcccbeadb796e70f1075b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06909fa1392ee8f29394ecd7411437e4f331ccd2007d0a534d0292175a7260ee\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328f409c465a8289f2e2e070ac2475187e763de8354cad8af5b5b4e0a3539628\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f7
5c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09980f0c6b16141a2a66d2768c6e01f75c37314fea48825e9b9707a3fc4d3f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-12T00:23:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d34cd079b74cccb75c66edf8c51b64113483767ced4eb56ad429d6527d0bb0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-12T00:23:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-12T00:23:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k7tv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-12T00:23:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-w4rbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-12T00:25:09Z is after 2025-08-24T17:21:41Z" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.978814 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.978862 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.978874 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.978891 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:09 crc kubenswrapper[4606]: I1212 00:25:09.978907 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:09Z","lastTransitionTime":"2025-12-12T00:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.081112 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.081153 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.081163 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.081198 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.081210 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.183459 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.183518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.183528 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.183540 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.183548 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.285593 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.285627 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.285641 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.285660 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.285674 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.388218 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.388248 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.388256 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.388269 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.388279 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.491138 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.491220 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.491242 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.491264 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.491277 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.594620 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.594661 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.594673 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.594688 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.594700 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.696308 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.696345 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.696355 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.696372 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.696429 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.698810 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:10 crc kubenswrapper[4606]: E1212 00:25:10.698937 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.698950 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.698992 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:10 crc kubenswrapper[4606]: E1212 00:25:10.699168 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:10 crc kubenswrapper[4606]: E1212 00:25:10.699294 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.798388 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.798422 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.798430 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.798443 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.798452 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.900671 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.900746 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.900761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.900785 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:10 crc kubenswrapper[4606]: I1212 00:25:10.900802 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:10Z","lastTransitionTime":"2025-12-12T00:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.004480 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.004529 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.004541 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.004560 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.004573 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.107159 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.107216 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.107227 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.107245 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.107258 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.210555 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.211237 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.211383 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.211506 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.211637 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.314518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.314599 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.314618 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.314641 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.314658 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.416915 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.416954 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.416962 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.416976 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.416986 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.520102 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.520688 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.520919 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.521122 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.521331 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.623833 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.624435 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.624599 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.624735 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.624881 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.699652 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:11 crc kubenswrapper[4606]: E1212 00:25:11.699840 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.701470 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:25:11 crc kubenswrapper[4606]: E1212 00:25:11.701814 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hpw5w_openshift-ovn-kubernetes(da25b0ba-5398-4185-a4c1-aeba44ae5633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.727425 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.727487 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.727509 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.727538 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.727562 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.830761 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.830810 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.830858 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.830883 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.830900 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.933427 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.933494 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.933513 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.933534 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:11 crc kubenswrapper[4606]: I1212 00:25:11.933549 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:11Z","lastTransitionTime":"2025-12-12T00:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.037334 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.037432 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.037463 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.037492 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.037516 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.141133 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.141222 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.141247 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.141277 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.141296 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.244452 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.244518 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.244539 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.244563 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.244581 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.346864 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.346911 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.346926 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.346945 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.346958 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.449957 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.450057 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.450081 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.450145 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.450168 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.553373 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.553445 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.553469 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.553499 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.553524 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.656637 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.656710 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.656734 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.656773 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.656808 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.699705 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.699998 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:12 crc kubenswrapper[4606]: E1212 00:25:12.700046 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.700138 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:12 crc kubenswrapper[4606]: E1212 00:25:12.700675 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:12 crc kubenswrapper[4606]: E1212 00:25:12.701022 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.759654 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.759710 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.759727 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.759750 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.759767 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.862066 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.862124 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.862141 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.862162 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.862723 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.965487 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.965525 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.965534 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.965548 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:12 crc kubenswrapper[4606]: I1212 00:25:12.965557 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:12Z","lastTransitionTime":"2025-12-12T00:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.054083 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.054138 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.054151 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.054183 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.054200 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:13Z","lastTransitionTime":"2025-12-12T00:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.077960 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.077993 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.078003 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.078020 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.078031 4606 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-12T00:25:13Z","lastTransitionTime":"2025-12-12T00:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.105076 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2"] Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.105656 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.109294 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.113822 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.115101 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.115464 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.136351 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xzcfk" podStartSLOduration=84.136324387 podStartE2EDuration="1m24.136324387s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.136208654 +0000 UTC m=+103.681561610" watchObservedRunningTime="2025-12-12 00:25:13.136324387 +0000 UTC m=+103.681677293" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.170280 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ff5265-7350-4878-bf09-d815871c131b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.170346 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a9ff5265-7350-4878-bf09-d815871c131b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.170369 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ff5265-7350-4878-bf09-d815871c131b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.170403 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a9ff5265-7350-4878-bf09-d815871c131b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.170548 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff5265-7350-4878-bf09-d815871c131b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.187249 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=81.187228221 podStartE2EDuration="1m21.187228221s" podCreationTimestamp="2025-12-12 00:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.167060577 +0000 UTC m=+103.712413443" watchObservedRunningTime="2025-12-12 00:25:13.187228221 +0000 UTC m=+103.732581087" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.207638 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=85.207611722 podStartE2EDuration="1m25.207611722s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.189817074 +0000 UTC m=+103.735169980" watchObservedRunningTime="2025-12-12 00:25:13.207611722 +0000 UTC m=+103.752964598" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.208073 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.208063104 podStartE2EDuration="1m22.208063104s" podCreationTimestamp="2025-12-12 00:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.207098577 +0000 UTC m=+103.752451473" watchObservedRunningTime="2025-12-12 00:25:13.208063104 +0000 UTC m=+103.753415990" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.249634 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podStartSLOduration=85.249615997 podStartE2EDuration="1m25.249615997s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.248724942 +0000 UTC m=+103.794077828" watchObservedRunningTime="2025-12-12 00:25:13.249615997 +0000 UTC m=+103.794968863" Dec 12 
00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.261909 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=32.26189142 podStartE2EDuration="32.26189142s" podCreationTimestamp="2025-12-12 00:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.261586462 +0000 UTC m=+103.806939328" watchObservedRunningTime="2025-12-12 00:25:13.26189142 +0000 UTC m=+103.807244296" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.271523 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ff5265-7350-4878-bf09-d815871c131b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.271578 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a9ff5265-7350-4878-bf09-d815871c131b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.271649 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ff5265-7350-4878-bf09-d815871c131b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.271683 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a9ff5265-7350-4878-bf09-d815871c131b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.271712 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a9ff5265-7350-4878-bf09-d815871c131b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.271733 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff5265-7350-4878-bf09-d815871c131b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.271984 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a9ff5265-7350-4878-bf09-d815871c131b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.272363 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ff5265-7350-4878-bf09-d815871c131b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.275271 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=56.275253944 podStartE2EDuration="56.275253944s" podCreationTimestamp="2025-12-12 00:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.274681038 +0000 UTC m=+103.820033904" watchObservedRunningTime="2025-12-12 00:25:13.275253944 +0000 UTC m=+103.820606820" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.281974 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ff5265-7350-4878-bf09-d815871c131b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.288797 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9ff5265-7350-4878-bf09-d815871c131b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fvfg2\" (UID: \"a9ff5265-7350-4878-bf09-d815871c131b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.356090 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qtz7b" podStartSLOduration=85.356072276 podStartE2EDuration="1m25.356072276s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.331749485 +0000 UTC m=+103.877102361" 
watchObservedRunningTime="2025-12-12 00:25:13.356072276 +0000 UTC m=+103.901425152" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.405479 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-554rp" podStartSLOduration=86.405458057 podStartE2EDuration="1m26.405458057s" podCreationTimestamp="2025-12-12 00:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.40518939 +0000 UTC m=+103.950542256" watchObservedRunningTime="2025-12-12 00:25:13.405458057 +0000 UTC m=+103.950810923" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.423507 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w4rbn" podStartSLOduration=84.423489482 podStartE2EDuration="1m24.423489482s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.422817203 +0000 UTC m=+103.968170069" watchObservedRunningTime="2025-12-12 00:25:13.423489482 +0000 UTC m=+103.968842348" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.426520 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.477860 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7rxps" podStartSLOduration=84.477842193 podStartE2EDuration="1m24.477842193s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:13.467666578 +0000 UTC m=+104.013019464" watchObservedRunningTime="2025-12-12 00:25:13.477842193 +0000 UTC m=+104.023195059" Dec 12 00:25:13 crc kubenswrapper[4606]: I1212 00:25:13.699048 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:13 crc kubenswrapper[4606]: E1212 00:25:13.699148 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:14 crc kubenswrapper[4606]: I1212 00:25:14.228775 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" event={"ID":"a9ff5265-7350-4878-bf09-d815871c131b","Type":"ContainerStarted","Data":"61d87214ee738ffc3292109815e79ad25587cf8957e04bcc4ed7c8b95643c434"} Dec 12 00:25:14 crc kubenswrapper[4606]: I1212 00:25:14.228854 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" event={"ID":"a9ff5265-7350-4878-bf09-d815871c131b","Type":"ContainerStarted","Data":"5e6b02fca22774fe52a05925988617ffdbd0cfee234be4d38c5674070aefd30f"} Dec 12 00:25:14 crc kubenswrapper[4606]: I1212 00:25:14.244824 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fvfg2" podStartSLOduration=85.244800292 podStartE2EDuration="1m25.244800292s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:14.244540625 +0000 UTC m=+104.789893491" watchObservedRunningTime="2025-12-12 00:25:14.244800292 +0000 UTC m=+104.790153188" Dec 12 00:25:14 crc kubenswrapper[4606]: I1212 00:25:14.698957 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:14 crc kubenswrapper[4606]: I1212 00:25:14.698980 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:14 crc kubenswrapper[4606]: I1212 00:25:14.698971 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:14 crc kubenswrapper[4606]: E1212 00:25:14.699094 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:14 crc kubenswrapper[4606]: E1212 00:25:14.699161 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:14 crc kubenswrapper[4606]: E1212 00:25:14.699252 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:15 crc kubenswrapper[4606]: I1212 00:25:15.699286 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:15 crc kubenswrapper[4606]: E1212 00:25:15.699521 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:16 crc kubenswrapper[4606]: I1212 00:25:16.699345 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:16 crc kubenswrapper[4606]: I1212 00:25:16.699392 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:16 crc kubenswrapper[4606]: E1212 00:25:16.699502 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:16 crc kubenswrapper[4606]: I1212 00:25:16.699535 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:16 crc kubenswrapper[4606]: E1212 00:25:16.699634 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:16 crc kubenswrapper[4606]: E1212 00:25:16.699715 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:17 crc kubenswrapper[4606]: I1212 00:25:17.699444 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:17 crc kubenswrapper[4606]: E1212 00:25:17.699691 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:18 crc kubenswrapper[4606]: I1212 00:25:18.699000 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:18 crc kubenswrapper[4606]: E1212 00:25:18.699206 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:18 crc kubenswrapper[4606]: I1212 00:25:18.699398 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:18 crc kubenswrapper[4606]: E1212 00:25:18.699562 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:18 crc kubenswrapper[4606]: I1212 00:25:18.699417 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:18 crc kubenswrapper[4606]: E1212 00:25:18.699782 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:19 crc kubenswrapper[4606]: I1212 00:25:19.698739 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:19 crc kubenswrapper[4606]: E1212 00:25:19.701394 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:20 crc kubenswrapper[4606]: I1212 00:25:20.699586 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:20 crc kubenswrapper[4606]: E1212 00:25:20.699979 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:20 crc kubenswrapper[4606]: I1212 00:25:20.699682 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:20 crc kubenswrapper[4606]: E1212 00:25:20.700056 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:20 crc kubenswrapper[4606]: I1212 00:25:20.699644 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:20 crc kubenswrapper[4606]: E1212 00:25:20.700112 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:21 crc kubenswrapper[4606]: I1212 00:25:21.699334 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:21 crc kubenswrapper[4606]: E1212 00:25:21.699518 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:22 crc kubenswrapper[4606]: I1212 00:25:22.699355 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:22 crc kubenswrapper[4606]: I1212 00:25:22.699379 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:22 crc kubenswrapper[4606]: E1212 00:25:22.699539 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:22 crc kubenswrapper[4606]: E1212 00:25:22.699637 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:22 crc kubenswrapper[4606]: I1212 00:25:22.699393 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:22 crc kubenswrapper[4606]: E1212 00:25:22.699765 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:23 crc kubenswrapper[4606]: I1212 00:25:23.255845 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/1.log" Dec 12 00:25:23 crc kubenswrapper[4606]: I1212 00:25:23.256851 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/0.log" Dec 12 00:25:23 crc kubenswrapper[4606]: I1212 00:25:23.256955 4606 generic.go:334] "Generic (PLEG): container finished" podID="b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0" containerID="d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976" exitCode=1 Dec 12 00:25:23 crc kubenswrapper[4606]: I1212 00:25:23.257012 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerDied","Data":"d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976"} Dec 12 00:25:23 crc kubenswrapper[4606]: I1212 00:25:23.257070 4606 scope.go:117] "RemoveContainer" containerID="84110159d07d70b0607ac029fec70fb043bbb0e310c39955e49becfe87e9faa0" Dec 12 00:25:23 crc kubenswrapper[4606]: I1212 00:25:23.257722 4606 scope.go:117] "RemoveContainer" containerID="d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976" Dec 12 00:25:23 crc kubenswrapper[4606]: E1212 00:25:23.257976 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xzcfk_openshift-multus(b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0)\"" pod="openshift-multus/multus-xzcfk" podUID="b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0" Dec 12 00:25:23 crc kubenswrapper[4606]: I1212 00:25:23.698828 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:23 crc kubenswrapper[4606]: E1212 00:25:23.699235 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:24 crc kubenswrapper[4606]: I1212 00:25:24.262780 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/1.log" Dec 12 00:25:24 crc kubenswrapper[4606]: I1212 00:25:24.698808 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:24 crc kubenswrapper[4606]: I1212 00:25:24.698824 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:24 crc kubenswrapper[4606]: I1212 00:25:24.698835 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:24 crc kubenswrapper[4606]: E1212 00:25:24.699689 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:24 crc kubenswrapper[4606]: E1212 00:25:24.699447 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:24 crc kubenswrapper[4606]: E1212 00:25:24.699894 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:25 crc kubenswrapper[4606]: I1212 00:25:25.699661 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:25 crc kubenswrapper[4606]: E1212 00:25:25.699836 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:26 crc kubenswrapper[4606]: I1212 00:25:26.698640 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:26 crc kubenswrapper[4606]: I1212 00:25:26.698736 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:26 crc kubenswrapper[4606]: I1212 00:25:26.698776 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:26 crc kubenswrapper[4606]: E1212 00:25:26.698889 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:26 crc kubenswrapper[4606]: E1212 00:25:26.699601 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:26 crc kubenswrapper[4606]: E1212 00:25:26.699700 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:26 crc kubenswrapper[4606]: I1212 00:25:26.700276 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:25:27 crc kubenswrapper[4606]: I1212 00:25:27.272268 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/3.log" Dec 12 00:25:27 crc kubenswrapper[4606]: I1212 00:25:27.274196 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerStarted","Data":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} Dec 12 00:25:27 crc kubenswrapper[4606]: I1212 00:25:27.274558 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:25:27 crc kubenswrapper[4606]: I1212 00:25:27.297880 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podStartSLOduration=98.297864551 podStartE2EDuration="1m38.297864551s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:27.297322065 +0000 UTC m=+117.842674951" watchObservedRunningTime="2025-12-12 00:25:27.297864551 +0000 UTC m=+117.843217417" Dec 12 00:25:27 crc kubenswrapper[4606]: I1212 00:25:27.478608 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mjjwd"] Dec 12 00:25:27 crc kubenswrapper[4606]: I1212 00:25:27.478718 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:27 crc kubenswrapper[4606]: E1212 00:25:27.478801 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:28 crc kubenswrapper[4606]: I1212 00:25:28.698625 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:28 crc kubenswrapper[4606]: E1212 00:25:28.699530 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:28 crc kubenswrapper[4606]: I1212 00:25:28.698625 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:28 crc kubenswrapper[4606]: E1212 00:25:28.700802 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:28 crc kubenswrapper[4606]: I1212 00:25:28.698717 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:28 crc kubenswrapper[4606]: E1212 00:25:28.701115 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:28 crc kubenswrapper[4606]: I1212 00:25:28.698671 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:28 crc kubenswrapper[4606]: E1212 00:25:28.701425 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:29 crc kubenswrapper[4606]: E1212 00:25:29.662489 4606 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 12 00:25:29 crc kubenswrapper[4606]: E1212 00:25:29.796020 4606 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 12 00:25:30 crc kubenswrapper[4606]: I1212 00:25:30.699238 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:30 crc kubenswrapper[4606]: E1212 00:25:30.699403 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:30 crc kubenswrapper[4606]: I1212 00:25:30.699270 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:30 crc kubenswrapper[4606]: I1212 00:25:30.699270 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:30 crc kubenswrapper[4606]: I1212 00:25:30.699270 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:30 crc kubenswrapper[4606]: E1212 00:25:30.699494 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:30 crc kubenswrapper[4606]: E1212 00:25:30.699717 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:30 crc kubenswrapper[4606]: E1212 00:25:30.699786 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:32 crc kubenswrapper[4606]: I1212 00:25:32.699782 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:32 crc kubenswrapper[4606]: E1212 00:25:32.699930 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:32 crc kubenswrapper[4606]: I1212 00:25:32.699987 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:32 crc kubenswrapper[4606]: I1212 00:25:32.699669 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:32 crc kubenswrapper[4606]: E1212 00:25:32.700153 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:32 crc kubenswrapper[4606]: E1212 00:25:32.700268 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:32 crc kubenswrapper[4606]: I1212 00:25:32.700804 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:32 crc kubenswrapper[4606]: E1212 00:25:32.700999 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:34 crc kubenswrapper[4606]: I1212 00:25:34.699217 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:34 crc kubenswrapper[4606]: I1212 00:25:34.699263 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:34 crc kubenswrapper[4606]: I1212 00:25:34.699227 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:34 crc kubenswrapper[4606]: E1212 00:25:34.699397 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:34 crc kubenswrapper[4606]: I1212 00:25:34.699466 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:34 crc kubenswrapper[4606]: E1212 00:25:34.699557 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:34 crc kubenswrapper[4606]: E1212 00:25:34.699791 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:34 crc kubenswrapper[4606]: I1212 00:25:34.699933 4606 scope.go:117] "RemoveContainer" containerID="d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976" Dec 12 00:25:34 crc kubenswrapper[4606]: E1212 00:25:34.700373 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:34 crc kubenswrapper[4606]: E1212 00:25:34.797133 4606 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 12 00:25:35 crc kubenswrapper[4606]: I1212 00:25:35.305255 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/1.log" Dec 12 00:25:35 crc kubenswrapper[4606]: I1212 00:25:35.305345 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerStarted","Data":"834943ef77c01eccd37c1ec6b4bf249f411ca5b5a275a69b5e5939d8f08242e8"} Dec 12 00:25:36 crc kubenswrapper[4606]: I1212 00:25:36.699342 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:36 crc kubenswrapper[4606]: I1212 00:25:36.699373 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:36 crc kubenswrapper[4606]: I1212 00:25:36.699407 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:36 crc kubenswrapper[4606]: E1212 00:25:36.699484 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:36 crc kubenswrapper[4606]: I1212 00:25:36.699516 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:36 crc kubenswrapper[4606]: E1212 00:25:36.699614 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:36 crc kubenswrapper[4606]: E1212 00:25:36.699705 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:36 crc kubenswrapper[4606]: E1212 00:25:36.699763 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:38 crc kubenswrapper[4606]: I1212 00:25:38.699233 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:38 crc kubenswrapper[4606]: I1212 00:25:38.699281 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:38 crc kubenswrapper[4606]: I1212 00:25:38.699323 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:38 crc kubenswrapper[4606]: I1212 00:25:38.699260 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:38 crc kubenswrapper[4606]: E1212 00:25:38.699463 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 12 00:25:38 crc kubenswrapper[4606]: E1212 00:25:38.699588 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 12 00:25:38 crc kubenswrapper[4606]: E1212 00:25:38.699751 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 12 00:25:38 crc kubenswrapper[4606]: E1212 00:25:38.699871 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mjjwd" podUID="0853dce1-c009-407e-960d-1113f85e503f" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.699613 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.699628 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.701159 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.701264 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.703294 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.705479 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.706258 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.707107 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.709398 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 12 00:25:40 crc kubenswrapper[4606]: I1212 00:25:40.712826 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.735132 4606 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.778936 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f6l62"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.779643 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.779927 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dd5sp"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.780646 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.780916 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pkx4d"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.781525 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.782411 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7kxjs"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.783794 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.787243 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-596sh"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.787967 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.788921 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.789085 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.811859 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.812165 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.812430 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.812685 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.812800 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.812917 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.813054 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.813655 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.813995 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-q728t"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.814241 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.816396 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.817508 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.817635 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.818148 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.818285 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.818384 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.818473 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.819102 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.819320 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.825630 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vlq68"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 
00:25:43.826001 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vlq68" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.831447 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.831611 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.832391 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.832508 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.837084 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.837360 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.837590 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.837722 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.837723 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.837923 4606 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.837976 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838116 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838342 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838513 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838615 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838638 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838694 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838706 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838777 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838837 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838864 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838921 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.838937 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.839019 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.839104 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.839115 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.839166 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.839284 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.841496 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.841862 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n"] Dec 12 00:25:43 crc 
kubenswrapper[4606]: I1212 00:25:43.842051 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.843611 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.845996 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.847761 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.848773 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.852770 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.853325 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.854752 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.855468 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.858571 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.863740 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dlrwh"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.864232 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.868553 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.868888 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.876860 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.877047 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.877145 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.877280 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.877379 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 
00:25:43.877740 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.877822 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29424960-bh2l7"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.877881 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.877970 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.878369 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.878420 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.878468 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.878593 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.879279 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.879912 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.879950 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 
12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.895923 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.881341 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dgfmw"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.897010 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.897258 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-frwjc"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.898575 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880047 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.900902 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880086 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880166 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880212 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880268 4606 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880288 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880343 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880613 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.880731 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.881221 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.881291 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.902496 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.908546 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.908789 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.909796 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.911653 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.911905 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.913841 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916700 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.912261 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-machine-approver-tls\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916804 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916826 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-client-ca\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916858 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-etcd-serving-ca\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916873 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rw2l\" (UniqueName: \"kubernetes.io/projected/16a5a061-f2aa-430e-9898-b7adff8ccb50-kube-api-access-2rw2l\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916891 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-config\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916915 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6fb93471-e75b-43b2-a4e2-d36bfc617930-node-pullsecrets\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 
00:25:43.916932 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916949 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-trusted-ca\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916963 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fb93471-e75b-43b2-a4e2-d36bfc617930-audit-dir\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.916977 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8xt5\" (UniqueName: \"kubernetes.io/projected/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-kube-api-access-n8xt5\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917004 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7st4c\" (UniqueName: 
\"kubernetes.io/projected/39613736-ea61-4bfb-8e8a-640d4e749bd5-kube-api-access-7st4c\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917022 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhbjx\" (UniqueName: \"kubernetes.io/projected/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-kube-api-access-hhbjx\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917035 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xlkk\" (UniqueName: \"kubernetes.io/projected/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-kube-api-access-6xlkk\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917070 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-etcd-client\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917086 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917098 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39613736-ea61-4bfb-8e8a-640d4e749bd5-serving-cert\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917121 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-config\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917135 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f35cb7e-9b27-44e8-bd6f-05757a107776-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917148 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-service-ca-bundle\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917162 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-serving-cert\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917200 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-config\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917221 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917245 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-encryption-config\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917259 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-config\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917275 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917291 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f35cb7e-9b27-44e8-bd6f-05757a107776-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917305 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-config\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917320 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f35cb7e-9b27-44e8-bd6f-05757a107776-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917338 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-serving-cert\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917360 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917375 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qc2\" (UniqueName: \"kubernetes.io/projected/6fb93471-e75b-43b2-a4e2-d36bfc617930-kube-api-access-58qc2\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917391 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a5a061-f2aa-430e-9898-b7adff8ccb50-serving-cert\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917407 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-config\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 
00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917422 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-image-import-ca\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917435 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-images\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917449 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbffd\" (UniqueName: \"kubernetes.io/projected/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-kube-api-access-xbffd\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917466 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-audit\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917480 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-auth-proxy-config\") pod 
\"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917496 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv569\" (UniqueName: \"kubernetes.io/projected/7f35cb7e-9b27-44e8-bd6f-05757a107776-kube-api-access-fv569\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.914188 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.917813 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.914298 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.914332 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.915774 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.915890 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.918095 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.918160 4606 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.918239 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.918279 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.918346 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.918424 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.918442 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.919350 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.919677 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.920101 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.922278 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-64s9l"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.922747 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.923056 
4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.923297 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.923539 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ffwdl"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.923999 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.925249 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.926815 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pkx4d"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.927606 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.927849 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.931961 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8nn7"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.932502 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.933267 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.933842 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.934253 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-596sh"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.935234 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.936233 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.943315 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vfqp"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.944294 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.944302 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dd5sp"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.946336 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f6l62"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.949047 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.959011 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.959332 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.966144 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.966894 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.972362 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs"] Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.998886 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 12 00:25:43 crc kubenswrapper[4606]: I1212 00:25:43.999161 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26"] Dec 12 00:25:43 crc 
kubenswrapper[4606]: I1212 00:25:43.999216 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.001237 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.002474 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.006011 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.010246 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.010992 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snpww"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.012012 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.013218 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-544fr"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.030610 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031625 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-serving-cert\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031672 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031699 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qc2\" (UniqueName: \"kubernetes.io/projected/6fb93471-e75b-43b2-a4e2-d36bfc617930-kube-api-access-58qc2\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031717 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a5a061-f2aa-430e-9898-b7adff8ccb50-serving-cert\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031737 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031758 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-etcd-client\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031775 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-config\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031791 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-image-import-ca\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031807 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-images\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031824 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xbffd\" (UniqueName: \"kubernetes.io/projected/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-kube-api-access-xbffd\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031840 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmcv9\" (UniqueName: \"kubernetes.io/projected/43bb746f-62c0-45c5-b1db-490810a0ba0e-kube-api-access-vmcv9\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031857 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-auth-proxy-config\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031877 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-audit\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.031995 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv569\" (UniqueName: \"kubernetes.io/projected/7f35cb7e-9b27-44e8-bd6f-05757a107776-kube-api-access-fv569\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.032262 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-config\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.032297 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-serving-cert\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.032316 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-client-ca\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.032331 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-machine-approver-tls\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.032526 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-audit\") pod 
\"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.032645 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.033088 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.033342 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.033680 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.033759 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.033959 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.033683 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-config\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.034155 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.034543 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.034626 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.034655 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-config\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.034806 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-image-import-ca\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035342 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-client-ca\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035456 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-images\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035553 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/84d78ffd-976a-4d55-9b6a-d10369b35718-available-featuregates\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035590 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-client-ca\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035641 
4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6vx\" (UniqueName: \"kubernetes.io/projected/c454b7c4-18db-442a-ae25-d66e7e6061f3-kube-api-access-kb6vx\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035658 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-policies\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035696 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6fb93471-e75b-43b2-a4e2-d36bfc617930-node-pullsecrets\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035714 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-etcd-serving-ca\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035729 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rw2l\" (UniqueName: \"kubernetes.io/projected/16a5a061-f2aa-430e-9898-b7adff8ccb50-kube-api-access-2rw2l\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035782 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-config\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035800 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvstb\" (UniqueName: \"kubernetes.io/projected/92300adf-2095-4cf3-901b-d17a9ab4deb5-kube-api-access-mvstb\") pod \"cluster-samples-operator-665b6dd947-vz9ft\" (UID: \"92300adf-2095-4cf3-901b-d17a9ab4deb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035816 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8869d3c-60d0-4a85-9b0f-84147ce018b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035832 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fb93471-e75b-43b2-a4e2-d36bfc617930-audit-dir\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035848 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035863 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-trusted-ca\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035920 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8xt5\" (UniqueName: \"kubernetes.io/projected/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-kube-api-access-n8xt5\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035940 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035956 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dgfmw\" 
(UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035974 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92300adf-2095-4cf3-901b-d17a9ab4deb5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vz9ft\" (UID: \"92300adf-2095-4cf3-901b-d17a9ab4deb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035990 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-audit-policies\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036006 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjxtp\" (UniqueName: \"kubernetes.io/projected/b993a003-2c7c-484b-a44b-17f07bdf6784-kube-api-access-pjxtp\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036024 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7st4c\" (UniqueName: \"kubernetes.io/projected/39613736-ea61-4bfb-8e8a-640d4e749bd5-kube-api-access-7st4c\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036039 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.035935 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-auth-proxy-config\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036101 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036117 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgz49\" (UniqueName: \"kubernetes.io/projected/5635c63d-bd71-4b80-b111-0fd9ff2cd053-kube-api-access-lgz49\") pod \"downloads-7954f5f757-vlq68\" (UID: \"5635c63d-bd71-4b80-b111-0fd9ff2cd053\") " pod="openshift-console/downloads-7954f5f757-vlq68" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036133 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8869d3c-60d0-4a85-9b0f-84147ce018b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: 
\"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036151 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhbjx\" (UniqueName: \"kubernetes.io/projected/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-kube-api-access-hhbjx\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036167 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xlkk\" (UniqueName: \"kubernetes.io/projected/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-kube-api-access-6xlkk\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036197 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f7cb4f-b96f-4538-8eb9-3a8826dada32-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036373 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-dir\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036392 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036409 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-oauth-config\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036423 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-oauth-serving-cert\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036439 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036453 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-serving-cert\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036470 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-etcd-client\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036487 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcl2w\" (UniqueName: \"kubernetes.io/projected/148f1f7a-b994-4984-a900-18e9d5868002-kube-api-access-fcl2w\") pod \"image-pruner-29424960-bh2l7\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036502 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036519 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-config\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036536 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036554 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39613736-ea61-4bfb-8e8a-640d4e749bd5-serving-cert\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.036570 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.037591 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-trusted-ca\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.037939 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-config\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.037971 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6fb93471-e75b-43b2-a4e2-d36bfc617930-node-pullsecrets\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.038359 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-etcd-serving-ca\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.038385 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fb93471-e75b-43b2-a4e2-d36bfc617930-audit-dir\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.044854 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.045168 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.045246 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-etcd-client\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " 
pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.045327 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f35cb7e-9b27-44e8-bd6f-05757a107776-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.045365 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-service-ca-bundle\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.045388 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-encryption-config\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.054292 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-serving-cert\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.054369 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.054391 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.054410 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vksbn\" (UniqueName: \"kubernetes.io/projected/079b1c50-eaa5-4be5-a0d2-0015a67a1875-kube-api-access-vksbn\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.055379 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-config\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.055635 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.056780 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f35cb7e-9b27-44e8-bd6f-05757a107776-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057368 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtttx\" (UniqueName: \"kubernetes.io/projected/64f7cb4f-b96f-4538-8eb9-3a8826dada32-kube-api-access-jtttx\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057403 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b993a003-2c7c-484b-a44b-17f07bdf6784-audit-dir\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057420 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f7cb4f-b96f-4538-8eb9-3a8826dada32-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057481 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-config\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 
00:25:44.057517 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f1f7a-b994-4984-a900-18e9d5868002-serviceca\") pod \"image-pruner-29424960-bh2l7\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057537 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tnt2\" (UniqueName: \"kubernetes.io/projected/84d78ffd-976a-4d55-9b6a-d10369b35718-kube-api-access-2tnt2\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057569 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43bb746f-62c0-45c5-b1db-490810a0ba0e-serving-cert\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057591 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-service-ca\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057636 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.057651 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.058706 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-serving-cert\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.058881 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.059800 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.059844 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-encryption-config\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.059875 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.059904 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-config\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.059924 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d78ffd-976a-4d55-9b6a-d10369b35718-serving-cert\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.059943 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8869d3c-60d0-4a85-9b0f-84147ce018b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.059977 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f35cb7e-9b27-44e8-bd6f-05757a107776-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.060007 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-config\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.060025 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-trusted-ca-bundle\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.060041 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nw9g\" (UniqueName: \"kubernetes.io/projected/d8869d3c-60d0-4a85-9b0f-84147ce018b5-kube-api-access-6nw9g\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.060059 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f35cb7e-9b27-44e8-bd6f-05757a107776-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.064161 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fb93471-e75b-43b2-a4e2-d36bfc617930-trusted-ca-bundle\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.064250 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-config\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.066435 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f35cb7e-9b27-44e8-bd6f-05757a107776-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.067488 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.067767 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-machine-approver-tls\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.071628 
4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29424960-bh2l7"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.071725 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.073492 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6fb93471-e75b-43b2-a4e2-d36bfc617930-encryption-config\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.074151 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.074758 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-service-ca-bundle\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.074772 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39613736-ea61-4bfb-8e8a-640d4e749bd5-config\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" 
Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.075318 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-serving-cert\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.075857 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.076795 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a5a061-f2aa-430e-9898-b7adff8ccb50-serving-cert\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.077910 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39613736-ea61-4bfb-8e8a-640d4e749bd5-serving-cert\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: \"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.079109 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-config\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.080610 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.081255 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.081794 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.082309 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dlrwh"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.083921 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2drsc"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.084458 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.084869 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.085439 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.087244 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.087596 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.087935 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.088091 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.094768 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q79c6"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.095461 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.096486 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.097920 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.099315 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.099491 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dgfmw"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.100462 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vlq68"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.101410 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8tjq"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.102361 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ffwdl"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.102457 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.105592 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.105633 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2drsc"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.105644 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8nn7"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.106551 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.107042 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.110208 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vfqp"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.110829 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q79c6"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.112349 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7kxjs"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.114671 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8tjq"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.116855 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq"] Dec 12 00:25:44 crc 
kubenswrapper[4606]: I1212 00:25:44.119329 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.120824 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.122276 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.123564 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snpww"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.125110 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.126854 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-frwjc"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.128678 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-544fr"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.130289 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.131557 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.133220 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 
00:25:44.134339 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.135578 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.136551 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.137810 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.144487 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.147163 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.149945 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.151410 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-898cq"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.152041 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-898cq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.152688 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pvw8f"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.153076 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.155977 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-898cq"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160562 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160595 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54e9b065-dbcb-4238-8da6-f36ae3e18dde-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160611 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160631 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-config\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160647 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-service-ca\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160669 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-serving-cert\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160685 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24b34ccb-e494-493e-98a9-31cf59981c38-proxy-tls\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160710 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/84d78ffd-976a-4d55-9b6a-d10369b35718-available-featuregates\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160727 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-client-ca\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160745 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6vx\" (UniqueName: \"kubernetes.io/projected/c454b7c4-18db-442a-ae25-d66e7e6061f3-kube-api-access-kb6vx\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160760 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-policies\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160774 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-client\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160788 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8869d3c-60d0-4a85-9b0f-84147ce018b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160803 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160860 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvstb\" (UniqueName: \"kubernetes.io/projected/92300adf-2095-4cf3-901b-d17a9ab4deb5-kube-api-access-mvstb\") pod \"cluster-samples-operator-665b6dd947-vz9ft\" (UID: \"92300adf-2095-4cf3-901b-d17a9ab4deb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160876 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160891 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgz49\" (UniqueName: \"kubernetes.io/projected/5635c63d-bd71-4b80-b111-0fd9ff2cd053-kube-api-access-lgz49\") pod \"downloads-7954f5f757-vlq68\" 
(UID: \"5635c63d-bd71-4b80-b111-0fd9ff2cd053\") " pod="openshift-console/downloads-7954f5f757-vlq68" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160908 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjxtp\" (UniqueName: \"kubernetes.io/projected/b993a003-2c7c-484b-a44b-17f07bdf6784-kube-api-access-pjxtp\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160942 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160960 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6960858-8b4e-4855-b52a-caa021444b7d-serving-cert\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160982 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-dir\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.160997 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161013 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54e9b065-dbcb-4238-8da6-f36ae3e18dde-config\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161029 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vksbn\" (UniqueName: \"kubernetes.io/projected/079b1c50-eaa5-4be5-a0d2-0015a67a1875-kube-api-access-vksbn\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161040 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z6php"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161718 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-z6php" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161044 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-encryption-config\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161888 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161906 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161927 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rblg6\" (UniqueName: \"kubernetes.io/projected/24b34ccb-e494-493e-98a9-31cf59981c38-kube-api-access-rblg6\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161945 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/b993a003-2c7c-484b-a44b-17f07bdf6784-audit-dir\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161962 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-config\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161973 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-policies\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.161977 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvhw\" (UniqueName: \"kubernetes.io/projected/e829b744-8b79-4f65-8783-ea555f280ce8-kube-api-access-tjvhw\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162014 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43bb746f-62c0-45c5-b1db-490810a0ba0e-serving-cert\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 
00:25:44.162032 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d78ffd-976a-4d55-9b6a-d10369b35718-serving-cert\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162047 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8869d3c-60d0-4a85-9b0f-84147ce018b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162064 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgrh\" (UniqueName: \"kubernetes.io/projected/0ef842ad-7b0b-4e92-bfb2-23306b0f85f9-kube-api-access-lzgrh\") pod \"dns-operator-744455d44c-9vfqp\" (UID: \"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162080 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-default-certificate\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162096 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-trusted-ca-bundle\") pod \"console-f9d7485db-dlrwh\" (UID: 
\"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162116 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j5bk\" (UniqueName: \"kubernetes.io/projected/d5b97b3b-8994-4d7f-a165-c04d13546e89-kube-api-access-8j5bk\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162137 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162154 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmcv9\" (UniqueName: \"kubernetes.io/projected/43bb746f-62c0-45c5-b1db-490810a0ba0e-kube-api-access-vmcv9\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162217 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162236 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-config\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162250 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-ca\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162267 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-audit-policies\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162282 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e829b744-8b79-4f65-8783-ea555f280ce8-proxy-tls\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162304 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162318 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92300adf-2095-4cf3-901b-d17a9ab4deb5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vz9ft\" (UID: \"92300adf-2095-4cf3-901b-d17a9ab4deb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162338 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162356 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8869d3c-60d0-4a85-9b0f-84147ce018b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.162378 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f7cb4f-b96f-4538-8eb9-3a8826dada32-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163047 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e829b744-8b79-4f65-8783-ea555f280ce8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163094 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163113 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k54mz\" (UniqueName: \"kubernetes.io/projected/21420052-2e90-4be9-923e-2b8d0d5ad189-kube-api-access-k54mz\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163129 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24b34ccb-e494-493e-98a9-31cf59981c38-images\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163145 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-oauth-config\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " 
pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163161 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-oauth-serving-cert\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163208 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163224 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-serving-cert\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163240 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcl2w\" (UniqueName: \"kubernetes.io/projected/148f1f7a-b994-4984-a900-18e9d5868002-kube-api-access-fcl2w\") pod \"image-pruner-29424960-bh2l7\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163273 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: 
\"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163312 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtttx\" (UniqueName: \"kubernetes.io/projected/64f7cb4f-b96f-4538-8eb9-3a8826dada32-kube-api-access-jtttx\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163327 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tnt2\" (UniqueName: \"kubernetes.io/projected/84d78ffd-976a-4d55-9b6a-d10369b35718-kube-api-access-2tnt2\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163362 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f7cb4f-b96f-4538-8eb9-3a8826dada32-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163378 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f1f7a-b994-4984-a900-18e9d5868002-serviceca\") pod \"image-pruner-29424960-bh2l7\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163395 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21420052-2e90-4be9-923e-2b8d0d5ad189-service-ca-bundle\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163410 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-metrics-certs\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163426 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-service-ca\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163444 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163459 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163484 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54e9b065-dbcb-4238-8da6-f36ae3e18dde-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163500 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ef842ad-7b0b-4e92-bfb2-23306b0f85f9-metrics-tls\") pod \"dns-operator-744455d44c-9vfqp\" (UID: \"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163515 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv74f\" (UniqueName: \"kubernetes.io/projected/f6960858-8b4e-4855-b52a-caa021444b7d-kube-api-access-wv74f\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163533 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nw9g\" (UniqueName: \"kubernetes.io/projected/d8869d3c-60d0-4a85-9b0f-84147ce018b5-kube-api-access-6nw9g\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163630 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-stats-auth\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") 
" pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163649 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24b34ccb-e494-493e-98a9-31cf59981c38-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163667 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-etcd-client\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.163682 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.165710 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-config\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.167939 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.169294 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z6php"] Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.169455 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.170242 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-serving-cert\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.171070 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.171637 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-service-ca\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.172482 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.172662 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b993a003-2c7c-484b-a44b-17f07bdf6784-audit-dir\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.172850 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-trusted-ca-bundle\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.172895 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/84d78ffd-976a-4d55-9b6a-d10369b35718-available-featuregates\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.173074 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64f7cb4f-b96f-4538-8eb9-3a8826dada32-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.173379 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b993a003-2c7c-484b-a44b-17f07bdf6784-audit-policies\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.174114 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-config\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.174162 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f1f7a-b994-4984-a900-18e9d5868002-serviceca\") pod \"image-pruner-29424960-bh2l7\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.174507 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.174704 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-dir\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.175055 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84d78ffd-976a-4d55-9b6a-d10369b35718-serving-cert\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.176204 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-oauth-serving-cert\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.176624 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8869d3c-60d0-4a85-9b0f-84147ce018b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.176645 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.176857 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92300adf-2095-4cf3-901b-d17a9ab4deb5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vz9ft\" (UID: \"92300adf-2095-4cf3-901b-d17a9ab4deb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" Dec 12 00:25:44 crc 
kubenswrapper[4606]: I1212 00:25:44.177051 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.177055 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-encryption-config\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.177447 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64f7cb4f-b96f-4538-8eb9-3a8826dada32-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.177637 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-client-ca\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.177857 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b993a003-2c7c-484b-a44b-17f07bdf6784-etcd-client\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.179448 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43bb746f-62c0-45c5-b1db-490810a0ba0e-serving-cert\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.179601 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-oauth-config\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.179850 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8869d3c-60d0-4a85-9b0f-84147ce018b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.180564 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.181428 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.181477 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.182627 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.183697 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.183894 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.184360 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.184428 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-serving-cert\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.185378 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.202154 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.221439 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.241553 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.261777 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264216 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/21420052-2e90-4be9-923e-2b8d0d5ad189-service-ca-bundle\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264254 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-metrics-certs\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264291 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54e9b065-dbcb-4238-8da6-f36ae3e18dde-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264315 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ef842ad-7b0b-4e92-bfb2-23306b0f85f9-metrics-tls\") pod \"dns-operator-744455d44c-9vfqp\" (UID: \"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264336 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv74f\" (UniqueName: \"kubernetes.io/projected/f6960858-8b4e-4855-b52a-caa021444b7d-kube-api-access-wv74f\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264367 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-stats-auth\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264388 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24b34ccb-e494-493e-98a9-31cf59981c38-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264412 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264453 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264475 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54e9b065-dbcb-4238-8da6-f36ae3e18dde-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264496 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264520 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-service-ca\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264551 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24b34ccb-e494-493e-98a9-31cf59981c38-proxy-tls\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264594 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-client\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264626 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264689 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6960858-8b4e-4855-b52a-caa021444b7d-serving-cert\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264738 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54e9b065-dbcb-4238-8da6-f36ae3e18dde-config\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264769 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rblg6\" (UniqueName: \"kubernetes.io/projected/24b34ccb-e494-493e-98a9-31cf59981c38-kube-api-access-rblg6\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264793 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-config\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264815 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvhw\" (UniqueName: \"kubernetes.io/projected/e829b744-8b79-4f65-8783-ea555f280ce8-kube-api-access-tjvhw\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264876 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgrh\" (UniqueName: \"kubernetes.io/projected/0ef842ad-7b0b-4e92-bfb2-23306b0f85f9-kube-api-access-lzgrh\") pod \"dns-operator-744455d44c-9vfqp\" (UID: \"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264900 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-default-certificate\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264933 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j5bk\" (UniqueName: \"kubernetes.io/projected/d5b97b3b-8994-4d7f-a165-c04d13546e89-kube-api-access-8j5bk\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264972 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-544fr\" (UID: 
\"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.264997 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-ca\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.265020 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e829b744-8b79-4f65-8783-ea555f280ce8-proxy-tls\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.265069 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e829b744-8b79-4f65-8783-ea555f280ce8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.265093 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k54mz\" (UniqueName: \"kubernetes.io/projected/21420052-2e90-4be9-923e-2b8d0d5ad189-kube-api-access-k54mz\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.265114 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/24b34ccb-e494-493e-98a9-31cf59981c38-images\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.265928 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21420052-2e90-4be9-923e-2b8d0d5ad189-service-ca-bundle\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.266690 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.266873 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24b34ccb-e494-493e-98a9-31cf59981c38-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.267345 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e829b744-8b79-4f65-8783-ea555f280ce8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc 
kubenswrapper[4606]: I1212 00:25:44.268017 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-stats-auth\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.269834 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-metrics-certs\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.270271 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/21420052-2e90-4be9-923e-2b8d0d5ad189-default-certificate\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.270830 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.282317 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.301336 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: 
I1212 00:25:44.322348 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.341422 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.349061 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6960858-8b4e-4855-b52a-caa021444b7d-serving-cert\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.371493 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.378485 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-client\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.381146 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.387009 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-config\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.402000 4606 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.406829 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-ca\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.422224 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.427025 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6960858-8b4e-4855-b52a-caa021444b7d-etcd-service-ca\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.442619 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.462384 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.481966 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.501799 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.521450 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.541662 4606 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.561940 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.572298 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54e9b065-dbcb-4238-8da6-f36ae3e18dde-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.582261 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.587032 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54e9b065-dbcb-4238-8da6-f36ae3e18dde-config\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.602645 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.622025 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.643254 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.661715 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.682776 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.701983 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.722585 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.742336 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.748195 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ef842ad-7b0b-4e92-bfb2-23306b0f85f9-metrics-tls\") pod \"dns-operator-744455d44c-9vfqp\" (UID: \"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.762153 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.770389 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e829b744-8b79-4f65-8783-ea555f280ce8-proxy-tls\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.781680 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.801519 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.806543 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24b34ccb-e494-493e-98a9-31cf59981c38-images\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.822731 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.842452 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.849035 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24b34ccb-e494-493e-98a9-31cf59981c38-proxy-tls\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.862009 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.882100 4606 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.902338 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.909704 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.930511 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.938057 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.942413 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 00:25:44.976937 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv569\" (UniqueName: \"kubernetes.io/projected/7f35cb7e-9b27-44e8-bd6f-05757a107776-kube-api-access-fv569\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:44 crc kubenswrapper[4606]: I1212 
00:25:44.982304 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.002870 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.021717 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.040807 4606 request.go:700] Waited for 1.006631819s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.042713 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.061059 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.081780 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.101848 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.121657 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 12 00:25:45 crc 
kubenswrapper[4606]: I1212 00:25:45.141404 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.161434 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.199839 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbffd\" (UniqueName: \"kubernetes.io/projected/9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b-kube-api-access-xbffd\") pod \"machine-api-operator-5694c8668f-pkx4d\" (UID: \"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.216034 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qc2\" (UniqueName: \"kubernetes.io/projected/6fb93471-e75b-43b2-a4e2-d36bfc617930-kube-api-access-58qc2\") pod \"apiserver-76f77b778f-f6l62\" (UID: \"6fb93471-e75b-43b2-a4e2-d36bfc617930\") " pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.236151 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8xt5\" (UniqueName: \"kubernetes.io/projected/05c0cc56-218e-423d-b6cc-72bf5db5fdfd-kube-api-access-n8xt5\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6r7p\" (UID: \"05c0cc56-218e-423d-b6cc-72bf5db5fdfd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.254878 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7st4c\" (UniqueName: \"kubernetes.io/projected/39613736-ea61-4bfb-8e8a-640d4e749bd5-kube-api-access-7st4c\") pod \"authentication-operator-69f744f599-7kxjs\" (UID: 
\"39613736-ea61-4bfb-8e8a-640d4e749bd5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.274820 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhbjx\" (UniqueName: \"kubernetes.io/projected/dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96-kube-api-access-hhbjx\") pod \"machine-approver-56656f9798-q728t\" (UID: \"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.294657 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xlkk\" (UniqueName: \"kubernetes.io/projected/a4e84bf9-56eb-4ce2-9719-5836eb7177a1-kube-api-access-6xlkk\") pod \"console-operator-58897d9998-596sh\" (UID: \"a4e84bf9-56eb-4ce2-9719-5836eb7177a1\") " pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.307445 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.312938 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rw2l\" (UniqueName: \"kubernetes.io/projected/16a5a061-f2aa-430e-9898-b7adff8ccb50-kube-api-access-2rw2l\") pod \"controller-manager-879f6c89f-dd5sp\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.321758 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.329268 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.345678 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f35cb7e-9b27-44e8-bd6f-05757a107776-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f7w26\" (UID: \"7f35cb7e-9b27-44e8-bd6f-05757a107776\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.352493 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.370838 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.381954 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.395683 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.405076 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.423239 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.435230 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.442672 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.454899 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.464305 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.482012 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.501817 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.522346 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.543820 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.562462 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.584666 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.604374 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.626407 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.643139 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.662381 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.680412 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-596sh"] Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.681283 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.683930 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7kxjs"] Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.704595 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.722264 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.732863 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p"] Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.742526 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 
00:25:45.759445 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-f6l62"] Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.762720 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.782018 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.782893 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26"] Dec 12 00:25:45 crc kubenswrapper[4606]: W1212 00:25:45.788456 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f35cb7e_9b27_44e8_bd6f_05757a107776.slice/crio-e3033a67f148099663ca576386290ef940f94fd2a35ec9a6446970a75bd2097f WatchSource:0}: Error finding container e3033a67f148099663ca576386290ef940f94fd2a35ec9a6446970a75bd2097f: Status 404 returned error can't find the container with id e3033a67f148099663ca576386290ef940f94fd2a35ec9a6446970a75bd2097f Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.801754 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.819406 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pkx4d"] Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.821843 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dd5sp"] Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.823155 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 12 
00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.841146 4606 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.861050 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.880779 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.902551 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.922641 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.941905 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.962952 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 12 00:25:45 crc kubenswrapper[4606]: I1212 00:25:45.982345 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.001795 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.023306 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.040921 4606 request.go:700] Waited for 1.87885995s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.047139 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.062879 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.102777 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcl2w\" (UniqueName: \"kubernetes.io/projected/148f1f7a-b994-4984-a900-18e9d5868002-kube-api-access-fcl2w\") pod \"image-pruner-29424960-bh2l7\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " pod="openshift-image-registry/image-pruner-29424960-bh2l7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.135974 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtttx\" (UniqueName: \"kubernetes.io/projected/64f7cb4f-b96f-4538-8eb9-3a8826dada32-kube-api-access-jtttx\") pod \"openshift-apiserver-operator-796bbdcf4f-xmr7n\" (UID: \"64f7cb4f-b96f-4538-8eb9-3a8826dada32\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.152780 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tnt2\" (UniqueName: \"kubernetes.io/projected/84d78ffd-976a-4d55-9b6a-d10369b35718-kube-api-access-2tnt2\") pod \"openshift-config-operator-7777fb866f-frwjc\" (UID: \"84d78ffd-976a-4d55-9b6a-d10369b35718\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.157664 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8869d3c-60d0-4a85-9b0f-84147ce018b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.175896 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvstb\" (UniqueName: \"kubernetes.io/projected/92300adf-2095-4cf3-901b-d17a9ab4deb5-kube-api-access-mvstb\") pod \"cluster-samples-operator-665b6dd947-vz9ft\" (UID: \"92300adf-2095-4cf3-901b-d17a9ab4deb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.179868 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.200773 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nw9g\" (UniqueName: \"kubernetes.io/projected/d8869d3c-60d0-4a85-9b0f-84147ce018b5-kube-api-access-6nw9g\") pod \"ingress-operator-5b745b69d9-9gzg8\" (UID: \"d8869d3c-60d0-4a85-9b0f-84147ce018b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.221320 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29424960-bh2l7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.222589 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgz49\" (UniqueName: \"kubernetes.io/projected/5635c63d-bd71-4b80-b111-0fd9ff2cd053-kube-api-access-lgz49\") pod \"downloads-7954f5f757-vlq68\" (UID: \"5635c63d-bd71-4b80-b111-0fd9ff2cd053\") " pod="openshift-console/downloads-7954f5f757-vlq68"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.236867 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.244025 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.260119 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmcv9\" (UniqueName: \"kubernetes.io/projected/43bb746f-62c0-45c5-b1db-490810a0ba0e-kube-api-access-vmcv9\") pod \"route-controller-manager-6576b87f9c-k5ghr\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.261324 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vksbn\" (UniqueName: \"kubernetes.io/projected/079b1c50-eaa5-4be5-a0d2-0015a67a1875-kube-api-access-vksbn\") pod \"oauth-openshift-558db77b4-dgfmw\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.297695 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6vx\" (UniqueName: \"kubernetes.io/projected/c454b7c4-18db-442a-ae25-d66e7e6061f3-kube-api-access-kb6vx\") pod \"console-f9d7485db-dlrwh\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " pod="openshift-console/console-f9d7485db-dlrwh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.300513 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjxtp\" (UniqueName: \"kubernetes.io/projected/b993a003-2c7c-484b-a44b-17f07bdf6784-kube-api-access-pjxtp\") pod \"apiserver-7bbb656c7d-n9mcs\" (UID: \"b993a003-2c7c-484b-a44b-17f07bdf6784\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.316660 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54e9b065-dbcb-4238-8da6-f36ae3e18dde-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4fdkr\" (UID: \"54e9b065-dbcb-4238-8da6-f36ae3e18dde\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.342744 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv74f\" (UniqueName: \"kubernetes.io/projected/f6960858-8b4e-4855-b52a-caa021444b7d-kube-api-access-wv74f\") pod \"etcd-operator-b45778765-ffwdl\" (UID: \"f6960858-8b4e-4855-b52a-caa021444b7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.361227 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vlq68"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.364236 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rblg6\" (UniqueName: \"kubernetes.io/projected/24b34ccb-e494-493e-98a9-31cf59981c38-kube-api-access-rblg6\") pod \"machine-config-operator-74547568cd-snpww\" (UID: \"24b34ccb-e494-493e-98a9-31cf59981c38\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.375561 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.382783 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j5bk\" (UniqueName: \"kubernetes.io/projected/d5b97b3b-8994-4d7f-a165-c04d13546e89-kube-api-access-8j5bk\") pod \"marketplace-operator-79b997595-544fr\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " pod="openshift-marketplace/marketplace-operator-79b997595-544fr"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.400471 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgrh\" (UniqueName: \"kubernetes.io/projected/0ef842ad-7b0b-4e92-bfb2-23306b0f85f9-kube-api-access-lzgrh\") pod \"dns-operator-744455d44c-9vfqp\" (UID: \"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.402877 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" event={"ID":"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96","Type":"ContainerStarted","Data":"b02e246e8df1bacea75a4777179c5a89e85e23156c9c5cf2c107a22820c09c32"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.402912 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" event={"ID":"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96","Type":"ContainerStarted","Data":"74aa1937765f510486457b17640863a40c07f00e7fdef0b6bd773542d3cb6a97"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.402939 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.413872 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.421959 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvhw\" (UniqueName: \"kubernetes.io/projected/e829b744-8b79-4f65-8783-ea555f280ce8-kube-api-access-tjvhw\") pod \"machine-config-controller-84d6567774-6h2gf\" (UID: \"e829b744-8b79-4f65-8783-ea555f280ce8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.428367 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" event={"ID":"7f35cb7e-9b27-44e8-bd6f-05757a107776","Type":"ContainerStarted","Data":"e3033a67f148099663ca576386290ef940f94fd2a35ec9a6446970a75bd2097f"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.429595 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" event={"ID":"6fb93471-e75b-43b2-a4e2-d36bfc617930","Type":"ContainerStarted","Data":"0c3fe0c6666a25e19cde4c60602c8ad9bf3ce88393ab36fcc87cf614d2dc80c6"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.431444 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-596sh" event={"ID":"a4e84bf9-56eb-4ce2-9719-5836eb7177a1","Type":"ContainerStarted","Data":"c535c2c4da50f77f76b69c4687b82694cd95a068cd5cc50a4a4c73109402698a"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.431463 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-596sh" event={"ID":"a4e84bf9-56eb-4ce2-9719-5836eb7177a1","Type":"ContainerStarted","Data":"645b8be93f11a740c89707c24929d4a074057433a4a6acde1234ae37be2b29bb"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.433443 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-596sh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.441707 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k54mz\" (UniqueName: \"kubernetes.io/projected/21420052-2e90-4be9-923e-2b8d0d5ad189-kube-api-access-k54mz\") pod \"router-default-5444994796-64s9l\" (UID: \"21420052-2e90-4be9-923e-2b8d0d5ad189\") " pod="openshift-ingress/router-default-5444994796-64s9l"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.441802 4606 patch_prober.go:28] interesting pod/console-operator-58897d9998-596sh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.441834 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-596sh" podUID="a4e84bf9-56eb-4ce2-9719-5836eb7177a1" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.442279 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" event={"ID":"39613736-ea61-4bfb-8e8a-640d4e749bd5","Type":"ContainerStarted","Data":"365722c47a18bc0a1bfed9b4d06b10295632f734041600924789e20d4bf177ff"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.442317 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" event={"ID":"39613736-ea61-4bfb-8e8a-640d4e749bd5","Type":"ContainerStarted","Data":"b01095fc74379b12b70c47cd2d88dd54a1404d34121ff2f147d0a9899ead1c1a"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.459646 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae2ce179-dd9a-4a2d-8f4c-e35424c12f94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-l5pmq\" (UID: \"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.484562 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" event={"ID":"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b","Type":"ContainerStarted","Data":"f2b8baea41112eb30a10927a8fc74168715620a87e58d3748eeafd8ab8727a76"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.487302 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" event={"ID":"05c0cc56-218e-423d-b6cc-72bf5db5fdfd","Type":"ContainerStarted","Data":"84431c3ff7cfae3100ec7b0925f7630f617061515a3e612188d337a69358f823"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.488039 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" event={"ID":"16a5a061-f2aa-430e-9898-b7adff8ccb50","Type":"ContainerStarted","Data":"eba57dcb592fb4f77aa6f032523d3a53c6709d99d908bb4d18a80f1a57ba16ff"}
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497229 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40292f84-e865-4368-9e37-e385dfcb5880-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497262 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2af8a85-751c-409a-908e-1334eb3e8e42-config\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497299 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-registry-tls\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497324 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-trusted-ca\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497353 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2af8a85-751c-409a-908e-1334eb3e8e42-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497397 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40292f84-e865-4368-9e37-e385dfcb5880-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497418 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-bound-sa-token\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497445 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2af8a85-751c-409a-908e-1334eb3e8e42-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497466 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-registry-certificates\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497494 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9p24\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-kube-api-access-f9p24\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.497518 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: E1212 00:25:46.497804 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:46.997790933 +0000 UTC m=+137.543143799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.518298 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dlrwh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.529074 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.552297 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.564363 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-64s9l"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.573329 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.595097 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.598596 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.598812 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.598838 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dc94fa6-9193-4aff-94d5-3ab846094763-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-c86kb\" (UID: \"4dc94fa6-9193-4aff-94d5-3ab846094763\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.598883 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e107c29e-3757-44e7-ad54-223801e19085-apiservice-cert\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.598904 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7c71bd-1fac-494e-8407-ecedfa667fc7-config-volume\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.598949 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2km\" (UniqueName: \"kubernetes.io/projected/3487af3b-5a9b-4e8f-8647-e800938cb74d-kube-api-access-zp2km\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.598977 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtfn\" (UniqueName: \"kubernetes.io/projected/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-kube-api-access-sdtfn\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599024 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-526dr\" (UniqueName: \"kubernetes.io/projected/e836a0be-41d7-4d9d-9b59-d1db42826d1b-kube-api-access-526dr\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599042 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b7cea121-b721-4fa7-9b48-057abf4048af-certs\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599073 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4s6z\" (UniqueName: \"kubernetes.io/projected/4dc94fa6-9193-4aff-94d5-3ab846094763-kube-api-access-h4s6z\") pod \"package-server-manager-789f6589d5-c86kb\" (UID: \"4dc94fa6-9193-4aff-94d5-3ab846094763\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599091 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcct9\" (UniqueName: \"kubernetes.io/projected/cfc4a02d-08c4-4861-b768-6e85a15b3897-kube-api-access-tcct9\") pod \"ingress-canary-898cq\" (UID: \"cfc4a02d-08c4-4861-b768-6e85a15b3897\") " pod="openshift-ingress-canary/ingress-canary-898cq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599144 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/848f49c7-d7b8-4490-9956-4014339c4a31-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4p7d9\" (UID: \"848f49c7-d7b8-4490-9956-4014339c4a31\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599227 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40292f84-e865-4368-9e37-e385dfcb5880-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599253 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-csi-data-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599282 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2af8a85-751c-409a-908e-1334eb3e8e42-config\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599349 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e107c29e-3757-44e7-ad54-223801e19085-tmpfs\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599375 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-registry-tls\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599394 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/77014cd9-7c35-4271-b875-a1f2b2a712d1-signing-cabundle\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599413 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbw7n\" (UniqueName: \"kubernetes.io/projected/b7cea121-b721-4fa7-9b48-057abf4048af-kube-api-access-fbw7n\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599469 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jgsw\" (UniqueName: \"kubernetes.io/projected/5c84db9e-ba60-4723-95f8-887b16eaa84a-kube-api-access-6jgsw\") pod \"multus-admission-controller-857f4d67dd-2drsc\" (UID: \"5c84db9e-ba60-4723-95f8-887b16eaa84a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599488 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-trusted-ca\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599518 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a76fe324-756f-48ce-8e78-9ee4b4933b96-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599536 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3487af3b-5a9b-4e8f-8647-e800938cb74d-profile-collector-cert\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599565 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-plugins-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599584 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7c71bd-1fac-494e-8407-ecedfa667fc7-secret-volume\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599623 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkb6s\" (UniqueName: \"kubernetes.io/projected/073ca95c-7738-4217-99af-72c9335f0c30-kube-api-access-qkb6s\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599686 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-mountpoint-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599704 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d6pm\" (UniqueName: \"kubernetes.io/projected/9b7c71bd-1fac-494e-8407-ecedfa667fc7-kube-api-access-6d6pm\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599739 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b7cea121-b721-4fa7-9b48-057abf4048af-node-bootstrap-token\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599764 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/073ca95c-7738-4217-99af-72c9335f0c30-config-volume\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599800 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2af8a85-751c-409a-908e-1334eb3e8e42-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599843 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6sw6\" (UniqueName: \"kubernetes.io/projected/a76fe324-756f-48ce-8e78-9ee4b4933b96-kube-api-access-x6sw6\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599939 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9fz\" (UniqueName: \"kubernetes.io/projected/848f49c7-d7b8-4490-9956-4014339c4a31-kube-api-access-vp9fz\") pod \"control-plane-machine-set-operator-78cbb6b69f-4p7d9\" (UID: \"848f49c7-d7b8-4490-9956-4014339c4a31\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.599989 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-srv-cert\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600006 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/77014cd9-7c35-4271-b875-a1f2b2a712d1-signing-key\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6"
Dec 12 00:25:46 crc kubenswrapper[4606]: E1212 00:25:46.600086 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.100065677 +0000 UTC m=+137.645418543 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600119 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-socket-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600150 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-config\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600186 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/073ca95c-7738-4217-99af-72c9335f0c30-metrics-tls\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php"
Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600220 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkcv7\" (UniqueName: \"kubernetes.io/projected/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-kube-api-access-mkcv7\") pod \"service-ca-operator-777779d784-hhzb9\"
(UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600244 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40292f84-e865-4368-9e37-e385dfcb5880-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600289 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a76fe324-756f-48ce-8e78-9ee4b4933b96-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600373 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-serving-cert\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600407 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3487af3b-5a9b-4e8f-8647-e800938cb74d-srv-cert\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600444 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-registration-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600504 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fswl\" (UniqueName: \"kubernetes.io/projected/77014cd9-7c35-4271-b875-a1f2b2a712d1-kube-api-access-4fswl\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600552 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-bound-sa-token\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600570 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2af8a85-751c-409a-908e-1334eb3e8e42-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600591 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e107c29e-3757-44e7-ad54-223801e19085-webhook-cert\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600612 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c84db9e-ba60-4723-95f8-887b16eaa84a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2drsc\" (UID: \"5c84db9e-ba60-4723-95f8-887b16eaa84a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600629 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hjh7\" (UniqueName: \"kubernetes.io/projected/e3e167ac-8428-49a3-899c-a2290fb8f78f-kube-api-access-5hjh7\") pod \"migrator-59844c95c7-mbfr9\" (UID: \"e3e167ac-8428-49a3-899c-a2290fb8f78f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600649 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-registry-certificates\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600666 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcq6q\" (UniqueName: \"kubernetes.io/projected/e107c29e-3757-44e7-ad54-223801e19085-kube-api-access-pcq6q\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600695 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f9p24\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-kube-api-access-f9p24\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.600710 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfc4a02d-08c4-4861-b768-6e85a15b3897-cert\") pod \"ingress-canary-898cq\" (UID: \"cfc4a02d-08c4-4861-b768-6e85a15b3897\") " pod="openshift-ingress-canary/ingress-canary-898cq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.602605 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40292f84-e865-4368-9e37-e385dfcb5880-ca-trust-extracted\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.602594 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2af8a85-751c-409a-908e-1334eb3e8e42-config\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.606399 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.609549 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40292f84-e865-4368-9e37-e385dfcb5880-installation-pull-secrets\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.612334 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-registry-certificates\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.613599 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-registry-tls\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.614924 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.615937 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-trusted-ca\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.616546 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2af8a85-751c-409a-908e-1334eb3e8e42-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.622815 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.642096 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.642790 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-bound-sa-token\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.669739 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2af8a85-751c-409a-908e-1334eb3e8e42-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4hwhh\" (UID: \"e2af8a85-751c-409a-908e-1334eb3e8e42\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.681122 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9p24\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-kube-api-access-f9p24\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701690 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e107c29e-3757-44e7-ad54-223801e19085-tmpfs\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701723 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/77014cd9-7c35-4271-b875-a1f2b2a712d1-signing-cabundle\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701745 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbw7n\" (UniqueName: \"kubernetes.io/projected/b7cea121-b721-4fa7-9b48-057abf4048af-kube-api-access-fbw7n\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701771 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jgsw\" (UniqueName: \"kubernetes.io/projected/5c84db9e-ba60-4723-95f8-887b16eaa84a-kube-api-access-6jgsw\") pod \"multus-admission-controller-857f4d67dd-2drsc\" (UID: \"5c84db9e-ba60-4723-95f8-887b16eaa84a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701790 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a76fe324-756f-48ce-8e78-9ee4b4933b96-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701807 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3487af3b-5a9b-4e8f-8647-e800938cb74d-profile-collector-cert\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 
00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701826 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-plugins-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701843 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7c71bd-1fac-494e-8407-ecedfa667fc7-secret-volume\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701860 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkb6s\" (UniqueName: \"kubernetes.io/projected/073ca95c-7738-4217-99af-72c9335f0c30-kube-api-access-qkb6s\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701878 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d6pm\" (UniqueName: \"kubernetes.io/projected/9b7c71bd-1fac-494e-8407-ecedfa667fc7-kube-api-access-6d6pm\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701893 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-mountpoint-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701911 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b7cea121-b721-4fa7-9b48-057abf4048af-node-bootstrap-token\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701926 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/073ca95c-7738-4217-99af-72c9335f0c30-config-volume\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701954 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6sw6\" (UniqueName: \"kubernetes.io/projected/a76fe324-756f-48ce-8e78-9ee4b4933b96-kube-api-access-x6sw6\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701979 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9fz\" (UniqueName: \"kubernetes.io/projected/848f49c7-d7b8-4490-9956-4014339c4a31-kube-api-access-vp9fz\") pod \"control-plane-machine-set-operator-78cbb6b69f-4p7d9\" (UID: \"848f49c7-d7b8-4490-9956-4014339c4a31\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.701997 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-srv-cert\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702014 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/77014cd9-7c35-4271-b875-a1f2b2a712d1-signing-key\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702031 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-socket-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702048 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-config\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702065 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/073ca95c-7738-4217-99af-72c9335f0c30-metrics-tls\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702086 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkcv7\" (UniqueName: 
\"kubernetes.io/projected/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-kube-api-access-mkcv7\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702103 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a76fe324-756f-48ce-8e78-9ee4b4933b96-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702119 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-serving-cert\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702134 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3487af3b-5a9b-4e8f-8647-e800938cb74d-srv-cert\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702150 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-registration-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: 
I1212 00:25:46.702185 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fswl\" (UniqueName: \"kubernetes.io/projected/77014cd9-7c35-4271-b875-a1f2b2a712d1-kube-api-access-4fswl\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702203 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e107c29e-3757-44e7-ad54-223801e19085-webhook-cert\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702217 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c84db9e-ba60-4723-95f8-887b16eaa84a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2drsc\" (UID: \"5c84db9e-ba60-4723-95f8-887b16eaa84a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702226 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-plugins-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702233 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hjh7\" (UniqueName: \"kubernetes.io/projected/e3e167ac-8428-49a3-899c-a2290fb8f78f-kube-api-access-5hjh7\") pod \"migrator-59844c95c7-mbfr9\" (UID: \"e3e167ac-8428-49a3-899c-a2290fb8f78f\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702290 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcq6q\" (UniqueName: \"kubernetes.io/projected/e107c29e-3757-44e7-ad54-223801e19085-kube-api-access-pcq6q\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702314 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfc4a02d-08c4-4861-b768-6e85a15b3897-cert\") pod \"ingress-canary-898cq\" (UID: \"cfc4a02d-08c4-4861-b768-6e85a15b3897\") " pod="openshift-ingress-canary/ingress-canary-898cq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702354 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702372 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702392 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4dc94fa6-9193-4aff-94d5-3ab846094763-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-c86kb\" (UID: \"4dc94fa6-9193-4aff-94d5-3ab846094763\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702420 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e107c29e-3757-44e7-ad54-223801e19085-apiservice-cert\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702438 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7c71bd-1fac-494e-8407-ecedfa667fc7-config-volume\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702458 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2km\" (UniqueName: \"kubernetes.io/projected/3487af3b-5a9b-4e8f-8647-e800938cb74d-kube-api-access-zp2km\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702476 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtfn\" (UniqueName: \"kubernetes.io/projected/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-kube-api-access-sdtfn\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 
00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702497 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-526dr\" (UniqueName: \"kubernetes.io/projected/e836a0be-41d7-4d9d-9b59-d1db42826d1b-kube-api-access-526dr\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702511 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b7cea121-b721-4fa7-9b48-057abf4048af-certs\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702530 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4s6z\" (UniqueName: \"kubernetes.io/projected/4dc94fa6-9193-4aff-94d5-3ab846094763-kube-api-access-h4s6z\") pod \"package-server-manager-789f6589d5-c86kb\" (UID: \"4dc94fa6-9193-4aff-94d5-3ab846094763\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702546 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcct9\" (UniqueName: \"kubernetes.io/projected/cfc4a02d-08c4-4861-b768-6e85a15b3897-kube-api-access-tcct9\") pod \"ingress-canary-898cq\" (UID: \"cfc4a02d-08c4-4861-b768-6e85a15b3897\") " pod="openshift-ingress-canary/ingress-canary-898cq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702571 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/848f49c7-d7b8-4490-9956-4014339c4a31-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-4p7d9\" (UID: \"848f49c7-d7b8-4490-9956-4014339c4a31\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702602 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-csi-data-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702732 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-csi-data-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702749 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-socket-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.702886 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-mountpoint-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.703100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e107c29e-3757-44e7-ad54-223801e19085-tmpfs\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: 
\"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.703762 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/77014cd9-7c35-4271-b875-a1f2b2a712d1-signing-cabundle\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.704448 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a76fe324-756f-48ce-8e78-9ee4b4933b96-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:46 crc kubenswrapper[4606]: E1212 00:25:46.710237 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.21016632 +0000 UTC m=+137.755519186 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.711226 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7c71bd-1fac-494e-8407-ecedfa667fc7-config-volume\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.711537 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3487af3b-5a9b-4e8f-8647-e800938cb74d-profile-collector-cert\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.711651 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfc4a02d-08c4-4861-b768-6e85a15b3897-cert\") pod \"ingress-canary-898cq\" (UID: \"cfc4a02d-08c4-4861-b768-6e85a15b3897\") " pod="openshift-ingress-canary/ingress-canary-898cq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.711762 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e836a0be-41d7-4d9d-9b59-d1db42826d1b-registration-dir\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.718918 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dc94fa6-9193-4aff-94d5-3ab846094763-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-c86kb\" (UID: \"4dc94fa6-9193-4aff-94d5-3ab846094763\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.719836 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-config\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.724881 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/848f49c7-d7b8-4490-9956-4014339c4a31-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4p7d9\" (UID: \"848f49c7-d7b8-4490-9956-4014339c4a31\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.727727 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/073ca95c-7738-4217-99af-72c9335f0c30-config-volume\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.744436 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b7cea121-b721-4fa7-9b48-057abf4048af-certs\") pod 
\"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.749830 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft"] Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.750476 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-srv-cert\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.756078 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7c71bd-1fac-494e-8407-ecedfa667fc7-secret-volume\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.756351 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b7cea121-b721-4fa7-9b48-057abf4048af-node-bootstrap-token\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.756422 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e107c29e-3757-44e7-ad54-223801e19085-apiservice-cert\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc 
kubenswrapper[4606]: I1212 00:25:46.756523 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3487af3b-5a9b-4e8f-8647-e800938cb74d-srv-cert\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.757380 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-serving-cert\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.757423 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/77014cd9-7c35-4271-b875-a1f2b2a712d1-signing-key\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.757697 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/073ca95c-7738-4217-99af-72c9335f0c30-metrics-tls\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " pod="openshift-dns/dns-default-z6php" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.757776 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e107c29e-3757-44e7-ad54-223801e19085-webhook-cert\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: \"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.758054 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.758301 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c84db9e-ba60-4723-95f8-887b16eaa84a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2drsc\" (UID: \"5c84db9e-ba60-4723-95f8-887b16eaa84a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.761510 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a76fe324-756f-48ce-8e78-9ee4b4933b96-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.764779 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hjh7\" (UniqueName: \"kubernetes.io/projected/e3e167ac-8428-49a3-899c-a2290fb8f78f-kube-api-access-5hjh7\") pod \"migrator-59844c95c7-mbfr9\" (UID: \"e3e167ac-8428-49a3-899c-a2290fb8f78f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.767430 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkb6s\" (UniqueName: \"kubernetes.io/projected/073ca95c-7738-4217-99af-72c9335f0c30-kube-api-access-qkb6s\") pod \"dns-default-z6php\" (UID: \"073ca95c-7738-4217-99af-72c9335f0c30\") " 
pod="openshift-dns/dns-default-z6php" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.781941 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d6pm\" (UniqueName: \"kubernetes.io/projected/9b7c71bd-1fac-494e-8407-ecedfa667fc7-kube-api-access-6d6pm\") pod \"collect-profiles-29424975-jpm67\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.800645 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n"] Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.805613 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:46 crc kubenswrapper[4606]: E1212 00:25:46.805957 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.305916813 +0000 UTC m=+137.851269679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.806159 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: E1212 00:25:46.808423 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.308406592 +0000 UTC m=+137.853759458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.829970 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jgsw\" (UniqueName: \"kubernetes.io/projected/5c84db9e-ba60-4723-95f8-887b16eaa84a-kube-api-access-6jgsw\") pod \"multus-admission-controller-857f4d67dd-2drsc\" (UID: \"5c84db9e-ba60-4723-95f8-887b16eaa84a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.833227 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbw7n\" (UniqueName: \"kubernetes.io/projected/b7cea121-b721-4fa7-9b48-057abf4048af-kube-api-access-fbw7n\") pod \"machine-config-server-pvw8f\" (UID: \"b7cea121-b721-4fa7-9b48-057abf4048af\") " pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.838455 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8"] Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.845627 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29424960-bh2l7"] Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.849750 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcq6q\" (UniqueName: \"kubernetes.io/projected/e107c29e-3757-44e7-ad54-223801e19085-kube-api-access-pcq6q\") pod \"packageserver-d55dfcdfc-8ltmf\" (UID: 
\"e107c29e-3757-44e7-ad54-223801e19085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.857895 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp2km\" (UniqueName: \"kubernetes.io/projected/3487af3b-5a9b-4e8f-8647-e800938cb74d-kube-api-access-zp2km\") pod \"catalog-operator-68c6474976-46kcq\" (UID: \"3487af3b-5a9b-4e8f-8647-e800938cb74d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.878587 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtfn\" (UniqueName: \"kubernetes.io/projected/ef54a55c-b8cc-405b-a5a6-ff9118635c9d-kube-api-access-sdtfn\") pod \"olm-operator-6b444d44fb-52vsx\" (UID: \"ef54a55c-b8cc-405b-a5a6-ff9118635c9d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.878952 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-frwjc"] Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.898499 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.907225 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.907436 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-526dr\" (UniqueName: \"kubernetes.io/projected/e836a0be-41d7-4d9d-9b59-d1db42826d1b-kube-api-access-526dr\") pod \"csi-hostpathplugin-h8tjq\" (UID: \"e836a0be-41d7-4d9d-9b59-d1db42826d1b\") " pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:46 crc kubenswrapper[4606]: E1212 00:25:46.907341 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.407320904 +0000 UTC m=+137.952673770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.907578 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:46 crc kubenswrapper[4606]: E1212 00:25:46.908137 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.408122486 +0000 UTC m=+137.953475352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.951158 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.960407 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.971641 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.973514 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkcv7\" (UniqueName: \"kubernetes.io/projected/e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77-kube-api-access-mkcv7\") pod \"service-ca-operator-777779d784-hhzb9\" (UID: \"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.978070 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.984804 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcct9\" (UniqueName: \"kubernetes.io/projected/cfc4a02d-08c4-4861-b768-6e85a15b3897-kube-api-access-tcct9\") pod \"ingress-canary-898cq\" (UID: \"cfc4a02d-08c4-4861-b768-6e85a15b3897\") " pod="openshift-ingress-canary/ingress-canary-898cq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.985056 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.991035 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.997129 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fswl\" (UniqueName: \"kubernetes.io/projected/77014cd9-7c35-4271-b875-a1f2b2a712d1-kube-api-access-4fswl\") pod \"service-ca-9c57cc56f-q79c6\" (UID: \"77014cd9-7c35-4271-b875-a1f2b2a712d1\") " pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:46 crc kubenswrapper[4606]: I1212 00:25:46.998669 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.008362 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.012944 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.013156 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.513135277 +0000 UTC m=+138.058488143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.028558 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.036151 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-898cq" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.045833 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pvw8f" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.052415 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-z6php" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.052924 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6sw6\" (UniqueName: \"kubernetes.io/projected/a76fe324-756f-48ce-8e78-9ee4b4933b96-kube-api-access-x6sw6\") pod \"kube-storage-version-migrator-operator-b67b599dd-dmjzw\" (UID: \"a76fe324-756f-48ce-8e78-9ee4b4933b96\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.089389 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.109634 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.110022 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.610008281 +0000 UTC m=+138.155361147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.122292 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vlq68"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.130793 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4s6z\" (UniqueName: \"kubernetes.io/projected/4dc94fa6-9193-4aff-94d5-3ab846094763-kube-api-access-h4s6z\") pod \"package-server-manager-789f6589d5-c86kb\" (UID: \"4dc94fa6-9193-4aff-94d5-3ab846094763\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.141946 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9fz\" (UniqueName: \"kubernetes.io/projected/848f49c7-d7b8-4490-9956-4014339c4a31-kube-api-access-vp9fz\") pod \"control-plane-machine-set-operator-78cbb6b69f-4p7d9\" (UID: \"848f49c7-d7b8-4490-9956-4014339c4a31\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.210507 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.210822 4606 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.710807675 +0000 UTC m=+138.256160541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.237642 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.252084 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.264518 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.299479 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dlrwh"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.301503 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs"] Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.312349 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.812334809 +0000 UTC m=+138.357687675 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.312045 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.366983 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.413659 4606 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.414007 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:47.913988727 +0000 UTC m=+138.459341593 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: W1212 00:25:47.439362 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2ce179_dd9a_4a2d_8f4c_e35424c12f94.slice/crio-406362299c597683595ff21621bd707046427e08e6b741bed68585085796aaaa WatchSource:0}: Error finding container 406362299c597683595ff21621bd707046427e08e6b741bed68585085796aaaa: Status 404 returned error can't find the container with id 406362299c597683595ff21621bd707046427e08e6b741bed68585085796aaaa Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.475838 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9vfqp"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.491603 4606 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dgfmw"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.506597 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.515040 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.515414 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.015403067 +0000 UTC m=+138.560755933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.535147 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snpww"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.555391 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ffwdl"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.583575 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" event={"ID":"05c0cc56-218e-423d-b6cc-72bf5db5fdfd","Type":"ContainerStarted","Data":"436a4dbedb1475e3d49d3f518e59508bf9318a36fa6861a4ce04f418d6af8cde"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.608275 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.613498 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-596sh" podStartSLOduration=118.613471995 podStartE2EDuration="1m58.613471995s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:47.566093367 +0000 UTC m=+138.111446233" watchObservedRunningTime="2025-12-12 00:25:47.613471995 +0000 UTC 
m=+138.158824861" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.620256 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.622748 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.122729893 +0000 UTC m=+138.668082759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.654662 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-544fr"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.663093 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.668278 4606 generic.go:334] "Generic (PLEG): container finished" podID="6fb93471-e75b-43b2-a4e2-d36bfc617930" containerID="437de1da480d55cdcb6d7bd04dc44a36b095f7805680bec423a549668ff66301" exitCode=0 Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.668350 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" event={"ID":"6fb93471-e75b-43b2-a4e2-d36bfc617930","Type":"ContainerDied","Data":"437de1da480d55cdcb6d7bd04dc44a36b095f7805680bec423a549668ff66301"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.682443 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" event={"ID":"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94","Type":"ContainerStarted","Data":"406362299c597683595ff21621bd707046427e08e6b741bed68585085796aaaa"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.696880 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" event={"ID":"b993a003-2c7c-484b-a44b-17f07bdf6784","Type":"ContainerStarted","Data":"1f3d70378fbb3fad1d5dc9ecf7143da7882b15b7111288a2a105547eeeac1030"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.726895 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf"] Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.737982 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" event={"ID":"7f35cb7e-9b27-44e8-bd6f-05757a107776","Type":"ContainerStarted","Data":"b4014222eda2a3af61261db80b47ee521d38454d627d2037d802b075e2d4c8dd"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.750729 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.753018 4606 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.253005406 +0000 UTC m=+138.798358272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.756159 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-64s9l" event={"ID":"21420052-2e90-4be9-923e-2b8d0d5ad189","Type":"ContainerStarted","Data":"11dced9d08e634e9843e78016c2ec4e9ec1dffa4fd12a34f92be7c4276386e28"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.756210 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-64s9l" event={"ID":"21420052-2e90-4be9-923e-2b8d0d5ad189","Type":"ContainerStarted","Data":"3f7f5c3e8436feea709d7d523125c3217159bc90a39e90d62ea536524b0d4885"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.775278 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dlrwh" event={"ID":"c454b7c4-18db-442a-ae25-d66e7e6061f3","Type":"ContainerStarted","Data":"0bdb716413dfc350432245354147f9ed6a46e60ec109954eeeac29fd89d8dcc9"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.788217 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" 
event={"ID":"16a5a061-f2aa-430e-9898-b7adff8ccb50","Type":"ContainerStarted","Data":"b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.789380 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.791522 4606 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dd5sp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.791570 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" podUID="16a5a061-f2aa-430e-9898-b7adff8ccb50" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.816993 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" event={"ID":"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b","Type":"ContainerStarted","Data":"b420f98b4cc471d30149ea3bd749f6d42a30c96feac6681d2db20337fc228c3a"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.818251 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29424960-bh2l7" event={"ID":"148f1f7a-b994-4984-a900-18e9d5868002","Type":"ContainerStarted","Data":"20c4027857969c7e2a2673cdc195dec9ed8c96c9024aa23f76e3cba36775fcbd"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.850205 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" 
event={"ID":"dcc51b5e-839a-4b7b-ac5c-e0e25c3aac96","Type":"ContainerStarted","Data":"a33e04e3d213708bdef0f508c956639e8d65f0f6b254ca1b95e39381fa4c290e"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.852240 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:47 crc kubenswrapper[4606]: E1212 00:25:47.855064 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.355042665 +0000 UTC m=+138.900395551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.903463 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" event={"ID":"64f7cb4f-b96f-4538-8eb9-3a8826dada32","Type":"ContainerStarted","Data":"a91a2d002bbf50dbe6c503887f39087e2195e0dc9e98d38739acca9c53e6573a"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.914875 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" 
event={"ID":"43bb746f-62c0-45c5-b1db-490810a0ba0e","Type":"ContainerStarted","Data":"ef8780fd900c9a3ff15e14ca55a321065bc15088435b697139975b77857ab7e6"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.922390 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" event={"ID":"84d78ffd-976a-4d55-9b6a-d10369b35718","Type":"ContainerStarted","Data":"fc72c1ff71925c4191ce5fa6123aca15cbbcd1275da865914e7dc7eb7c3878b0"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.940348 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vlq68" event={"ID":"5635c63d-bd71-4b80-b111-0fd9ff2cd053","Type":"ContainerStarted","Data":"446f295733c7b09e7a4f64bfd3cbbb56f890f6348a71c75f818ca13a0c777b80"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.956675 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" event={"ID":"d8869d3c-60d0-4a85-9b0f-84147ce018b5","Type":"ContainerStarted","Data":"23612b6f1aa9d59f63145561bcb0b93706c3d5b4838666f0e002d2041f1b7c4d"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.960548 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" event={"ID":"92300adf-2095-4cf3-901b-d17a9ab4deb5","Type":"ContainerStarted","Data":"497430da5293d6b0fd81a53420171cce1f8e940fa35ba71fcae28bc7a606d9f3"} Dec 12 00:25:47 crc kubenswrapper[4606]: I1212 00:25:47.961630 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:47 crc 
kubenswrapper[4606]: E1212 00:25:47.962741 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.46273016 +0000 UTC m=+139.008083026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.062728 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.063832 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.563813461 +0000 UTC m=+139.109166327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.164814 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.165144 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.66513038 +0000 UTC m=+139.210483266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.265602 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.266199 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.766148359 +0000 UTC m=+139.311501265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.326378 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq"] Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.353209 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2drsc"] Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.354824 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9"] Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.374102 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.374406 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.87439597 +0000 UTC m=+139.419748836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: W1212 00:25:48.414453 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e167ac_8428_49a3_899c_a2290fb8f78f.slice/crio-35caa0f63ad1873b48d34bdb78fb9beffcfa052a67660c74b9523d90e841d488 WatchSource:0}: Error finding container 35caa0f63ad1873b48d34bdb78fb9beffcfa052a67660c74b9523d90e841d488: Status 404 returned error can't find the container with id 35caa0f63ad1873b48d34bdb78fb9beffcfa052a67660c74b9523d90e841d488 Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.447150 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-596sh" Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.479062 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.479435 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:48.979421022 +0000 UTC m=+139.524773888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.566981 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.597513 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.597775 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.097764262 +0000 UTC m=+139.643117128 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.686448 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:48 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:48 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:48 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.686496 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.697864 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.700281 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:49.200245283 +0000 UTC m=+139.745598149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.724970 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.733793 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.233778535 +0000 UTC m=+139.779131401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.836494 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.836845 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.336829822 +0000 UTC m=+139.882182688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.851594 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-898cq"] Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.878293 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q79c6"] Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.937234 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7kxjs" podStartSLOduration=120.937217884 podStartE2EDuration="2m0.937217884s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:48.914484742 +0000 UTC m=+139.459837608" watchObservedRunningTime="2025-12-12 00:25:48.937217884 +0000 UTC m=+139.482570750" Dec 12 00:25:48 crc kubenswrapper[4606]: I1212 00:25:48.944769 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:48 crc kubenswrapper[4606]: E1212 00:25:48.945107 4606 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.445094123 +0000 UTC m=+139.990446999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.029675 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8tjq"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.038458 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" event={"ID":"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9","Type":"ContainerStarted","Data":"c4040d918024430e443a3b4d5575439e565cc6af3c14847243ac3bb03ca8546e"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.046185 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.046754 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.54673706 +0000 UTC m=+140.092089926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: W1212 00:25:49.073446 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77014cd9_7c35_4271_b875_a1f2b2a712d1.slice/crio-2166c67c6cd78e9697be7f94eb3a4ddf4fd1bb4996ec94eeecd780430e17d46a WatchSource:0}: Error finding container 2166c67c6cd78e9697be7f94eb3a4ddf4fd1bb4996ec94eeecd780430e17d46a: Status 404 returned error can't find the container with id 2166c67c6cd78e9697be7f94eb3a4ddf4fd1bb4996ec94eeecd780430e17d46a Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.073649 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" event={"ID":"d5b97b3b-8994-4d7f-a165-c04d13546e89","Type":"ContainerStarted","Data":"40487f2c64688dbbe919542e52b95392d2fdf203e6bf59fca03ab76a8fdf3b9b"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.077593 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q728t" podStartSLOduration=122.077579628 podStartE2EDuration="2m2.077579628s" podCreationTimestamp="2025-12-12 00:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.076347434 +0000 UTC m=+139.621700300" watchObservedRunningTime="2025-12-12 00:25:49.077579628 +0000 UTC m=+139.622932494" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.078010 4606 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6r7p" podStartSLOduration=120.07800487 podStartE2EDuration="2m0.07800487s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.04778354 +0000 UTC m=+139.593136416" watchObservedRunningTime="2025-12-12 00:25:49.07800487 +0000 UTC m=+139.623357736" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.084871 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.085063 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" event={"ID":"e107c29e-3757-44e7-ad54-223801e19085","Type":"ContainerStarted","Data":"21e970d4fcb38153656e4f572ea3c512ef714ee4ab6d0a05d548b666f7facc57"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.098674 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" event={"ID":"e829b744-8b79-4f65-8783-ea555f280ce8","Type":"ContainerStarted","Data":"bd1394ffd822f89f9081958445c182b56d2115d339bce12a09214f53e368ae98"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.102551 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" event={"ID":"92300adf-2095-4cf3-901b-d17a9ab4deb5","Type":"ContainerStarted","Data":"71f602e6583b1d799f8e9fbdfa1d63ef2aebc2173029c2efa27675cbbf5e2e9f"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.115445 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" 
event={"ID":"e3e167ac-8428-49a3-899c-a2290fb8f78f","Type":"ContainerStarted","Data":"35caa0f63ad1873b48d34bdb78fb9beffcfa052a67660c74b9523d90e841d488"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.123063 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" event={"ID":"54e9b065-dbcb-4238-8da6-f36ae3e18dde","Type":"ContainerStarted","Data":"3e273525cf73c05068159b1269bf64b9c0d3496f0bd105c0c2c050622df998f4"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.130803 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" event={"ID":"e2af8a85-751c-409a-908e-1334eb3e8e42","Type":"ContainerStarted","Data":"f0bf23bd5dc7f25831661d6edf9a433eab8bf0377f447791af9e23963f170458"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.148949 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.149313 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.649301873 +0000 UTC m=+140.194654739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.159916 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dlrwh" event={"ID":"c454b7c4-18db-442a-ae25-d66e7e6061f3","Type":"ContainerStarted","Data":"6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.198410 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" podStartSLOduration=121.198390089 podStartE2EDuration="2m1.198390089s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.110920166 +0000 UTC m=+139.656273032" watchObservedRunningTime="2025-12-12 00:25:49.198390089 +0000 UTC m=+139.743742945" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.214439 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" event={"ID":"9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b","Type":"ContainerStarted","Data":"83f3c01d51f8549b6173a30f8d513f65ce6792df6bb3f00864dee5139de98948"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.256265 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.256646 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.756626888 +0000 UTC m=+140.301979754 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.266484 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.364685 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.368184 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:49.868154341 +0000 UTC m=+140.413507207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.395490 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xmr7n" event={"ID":"64f7cb4f-b96f-4538-8eb9-3a8826dada32","Type":"ContainerStarted","Data":"1c8ec9f247b4eb5b6922e8cdc3d52d34a44ec13889036f60c7d32ce01a96f4ea"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.399322 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f7w26" podStartSLOduration=120.399309507 podStartE2EDuration="2m0.399309507s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.277218411 +0000 UTC m=+139.822571277" watchObservedRunningTime="2025-12-12 00:25:49.399309507 +0000 UTC m=+139.944662383" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.404975 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.425549 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.436091 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vlq68" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.437046 4606 patch_prober.go:28] interesting pod/downloads-7954f5f757-vlq68 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.437164 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vlq68" podUID="5635c63d-bd71-4b80-b111-0fd9ff2cd053" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.452019 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-64s9l" podStartSLOduration=120.452000373 podStartE2EDuration="2m0.452000373s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.433073306 +0000 UTC m=+139.978426172" watchObservedRunningTime="2025-12-12 00:25:49.452000373 +0000 UTC m=+139.997353239" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.456078 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.482082 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pvw8f" event={"ID":"b7cea121-b721-4fa7-9b48-057abf4048af","Type":"ContainerStarted","Data":"8e5b15548cb1a1f8e5cff313aa3f46f65676bb958cd0744e20c1ac4ed0994367"} Dec 12 00:25:49 crc 
kubenswrapper[4606]: I1212 00:25:49.493538 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.493909 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:49.993885498 +0000 UTC m=+140.539238404 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.495785 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" podStartSLOduration=120.49576974 podStartE2EDuration="2m0.49576974s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.492492039 +0000 UTC m=+140.037844915" watchObservedRunningTime="2025-12-12 00:25:49.49576974 +0000 UTC m=+140.041122606" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.513414 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" 
event={"ID":"079b1c50-eaa5-4be5-a0d2-0015a67a1875","Type":"ContainerStarted","Data":"8fea28cea6c35e7ac4af8b0e8dd81ced09cae2d35eb72f8ba1c189771a683ded"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.519955 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" event={"ID":"24b34ccb-e494-493e-98a9-31cf59981c38","Type":"ContainerStarted","Data":"cc33db71702dc9a9b8c7879939be084ba5fc03620e5540e9c7e068768d508cca"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.547396 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" event={"ID":"3487af3b-5a9b-4e8f-8647-e800938cb74d","Type":"ContainerStarted","Data":"4458d9316d6be1a64bd81ecc715dc5a85701f3f93ce15fb924def62756b20f58"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.553384 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pkx4d" podStartSLOduration=120.553368452 podStartE2EDuration="2m0.553368452s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.550000589 +0000 UTC m=+140.095353455" watchObservedRunningTime="2025-12-12 00:25:49.553368452 +0000 UTC m=+140.098721328" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.568551 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:49 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:49 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:49 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:49 crc 
kubenswrapper[4606]: I1212 00:25:49.568609 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.571604 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z6php"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.595330 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.595890 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.095872855 +0000 UTC m=+140.641225721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.622967 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9"] Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.666575 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dlrwh" podStartSLOduration=120.666560291 podStartE2EDuration="2m0.666560291s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.631563087 +0000 UTC m=+140.176915963" watchObservedRunningTime="2025-12-12 00:25:49.666560291 +0000 UTC m=+140.211913147" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.670652 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vlq68" podStartSLOduration=120.670642084 podStartE2EDuration="2m0.670642084s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.665524262 +0000 UTC m=+140.210877128" watchObservedRunningTime="2025-12-12 00:25:49.670642084 +0000 UTC m=+140.215994950" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.698728 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.699087 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.199072515 +0000 UTC m=+140.744425381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.706470 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29424960-bh2l7" podStartSLOduration=121.706457681 podStartE2EDuration="2m1.706457681s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.705495084 +0000 UTC m=+140.250847950" watchObservedRunningTime="2025-12-12 00:25:49.706457681 +0000 UTC m=+140.251810547" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.753242 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" podStartSLOduration=120.753223901 podStartE2EDuration="2m0.753223901s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:49.737282598 +0000 UTC m=+140.282635474" watchObservedRunningTime="2025-12-12 00:25:49.753223901 +0000 UTC m=+140.298576767" Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.802061 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.803331 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.303319765 +0000 UTC m=+140.848672631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.909403 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:49 crc kubenswrapper[4606]: E1212 00:25:49.909719 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.409703894 +0000 UTC m=+140.955056760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.923374 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" event={"ID":"d8869d3c-60d0-4a85-9b0f-84147ce018b5","Type":"ContainerStarted","Data":"27ace87a5225e0b692c6fc8f5b206bdb73cd913cea0e2f8202e73632381dcf19"} Dec 12 00:25:49 crc kubenswrapper[4606]: I1212 00:25:49.942687 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" event={"ID":"f6960858-8b4e-4855-b52a-caa021444b7d","Type":"ContainerStarted","Data":"2db9ebe10d655f3d6ab553d613b9e061409da508a95d946483c95bba8fd82008"} Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.013949 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" event={"ID":"5c84db9e-ba60-4723-95f8-887b16eaa84a","Type":"ContainerStarted","Data":"96e4b745821490405696c7131e583085639d252d5bdf3a5d95d5d2c70ee5179e"} Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.014876 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.015285 4606 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.51527486 +0000 UTC m=+141.060627716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.041864 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" event={"ID":"43bb746f-62c0-45c5-b1db-490810a0ba0e","Type":"ContainerStarted","Data":"478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35"} Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.041901 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.088108 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.115525 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.117557 4606 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.617540775 +0000 UTC m=+141.162893641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.221524 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.221613 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.721577979 +0000 UTC m=+141.266930845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.322718 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.323058 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.823035881 +0000 UTC m=+141.368388747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.323301 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.323658 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.823642828 +0000 UTC m=+141.368995694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.426232 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.426619 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:50.926600291 +0000 UTC m=+141.471953157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.438954 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" podStartSLOduration=121.438933664 podStartE2EDuration="2m1.438933664s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:50.436473046 +0000 UTC m=+140.981825922" watchObservedRunningTime="2025-12-12 00:25:50.438933664 +0000 UTC m=+140.984286520" Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.476082 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.528402 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.528792 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:51.028780063 +0000 UTC m=+141.574132929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.582292 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:50 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:50 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:50 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.582359 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.630622 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.631266 4606 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.131236033 +0000 UTC m=+141.676588899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.734812 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.735131 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.235118323 +0000 UTC m=+141.780471189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.838262 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.838799 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.338784766 +0000 UTC m=+141.884137622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:50 crc kubenswrapper[4606]: I1212 00:25:50.941979 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:50 crc kubenswrapper[4606]: E1212 00:25:50.942279 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.442268384 +0000 UTC m=+141.987621250 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.044458 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.044735 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.544722084 +0000 UTC m=+142.090074950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.146324 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.147023 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.647011309 +0000 UTC m=+142.192364175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.191869 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" event={"ID":"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77","Type":"ContainerStarted","Data":"a15d237c30071237440a6134089090accb957ec9500882bab07910f71528a480"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.209643 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" event={"ID":"92300adf-2095-4cf3-901b-d17a9ab4deb5","Type":"ContainerStarted","Data":"8fa40cf16b443888f1e6cf8cddb45373b8952c530ff7115a0aded46bd06c524f"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.214863 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" event={"ID":"24b34ccb-e494-493e-98a9-31cf59981c38","Type":"ContainerStarted","Data":"517a7c1a2fb46f1a1582d84b3d7635756b5e2c4131301e40c7518f4089cac088"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.231712 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-898cq" event={"ID":"cfc4a02d-08c4-4861-b768-6e85a15b3897","Type":"ContainerStarted","Data":"4a5ef9152545866b1b0155c0614a00783e9c77874e5f1b32f7dadd9dc506412a"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.231755 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-898cq" 
event={"ID":"cfc4a02d-08c4-4861-b768-6e85a15b3897","Type":"ContainerStarted","Data":"99f6d0101f7ec4d36a44244cd87a42a32de416b6efcd785e32b79d7541104be0"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.257070 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.257426 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.75740189 +0000 UTC m=+142.302754756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.281415 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" event={"ID":"e107c29e-3757-44e7-ad54-223801e19085","Type":"ContainerStarted","Data":"4626810819f6c67f919d23329127f715587e28af8ab70480e60d696c59e2b526"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.282214 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 
00:25:51.283717 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" event={"ID":"4dc94fa6-9193-4aff-94d5-3ab846094763","Type":"ContainerStarted","Data":"ffc43994f5b91e07413af03a55d8221be285d6f875026aee3fbdeae6058ce6dd"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.285061 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" event={"ID":"77014cd9-7c35-4271-b875-a1f2b2a712d1","Type":"ContainerStarted","Data":"2166c67c6cd78e9697be7f94eb3a4ddf4fd1bb4996ec94eeecd780430e17d46a"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.286860 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29424960-bh2l7" event={"ID":"148f1f7a-b994-4984-a900-18e9d5868002","Type":"ContainerStarted","Data":"f0f374be38c7614b20aa284d3ad950f1cb7ccd72c89a2c8181f7c2ab0c3b634d"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.288350 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" event={"ID":"a76fe324-756f-48ce-8e78-9ee4b4933b96","Type":"ContainerStarted","Data":"77d0bea3aafb0b6424b4ccbb915c972ce98b08d2915af22a73d8f474d586e4cd"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.289576 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" event={"ID":"079b1c50-eaa5-4be5-a0d2-0015a67a1875","Type":"ContainerStarted","Data":"61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.290266 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.298648 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" event={"ID":"3487af3b-5a9b-4e8f-8647-e800938cb74d","Type":"ContainerStarted","Data":"cc7ee92e5780d6472068b5e0e5bc4a2a51b8044370847fed68818206a4bb1269"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.299573 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.301066 4606 generic.go:334] "Generic (PLEG): container finished" podID="84d78ffd-976a-4d55-9b6a-d10369b35718" containerID="9c704a4344eb0abc136a628aa1cf406b965877a290fdbb4e7bf0ff51fd7bf70e" exitCode=0 Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.301147 4606 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-dgfmw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.301242 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" podUID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.301454 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" event={"ID":"84d78ffd-976a-4d55-9b6a-d10369b35718","Type":"ContainerDied","Data":"9c704a4344eb0abc136a628aa1cf406b965877a290fdbb4e7bf0ff51fd7bf70e"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.311558 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" 
event={"ID":"e2af8a85-751c-409a-908e-1334eb3e8e42","Type":"ContainerStarted","Data":"57e0e39658a395230a79f71cdb8f15a3a7fdc01cbd9c959e2a22a280dd0c464e"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.313499 4606 generic.go:334] "Generic (PLEG): container finished" podID="b993a003-2c7c-484b-a44b-17f07bdf6784" containerID="f9c787bb720381d74959b70991a5ca03edd1ab970b3b197f1f49169019e3aabe" exitCode=0 Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.313598 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" event={"ID":"b993a003-2c7c-484b-a44b-17f07bdf6784","Type":"ContainerDied","Data":"f9c787bb720381d74959b70991a5ca03edd1ab970b3b197f1f49169019e3aabe"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.316544 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.320324 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z6php" event={"ID":"073ca95c-7738-4217-99af-72c9335f0c30","Type":"ContainerStarted","Data":"e3259173530663877b813f43b3d5e7dfb82a4566bfd12d89e6b74c943705218c"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.322677 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9gzg8" event={"ID":"d8869d3c-60d0-4a85-9b0f-84147ce018b5","Type":"ContainerStarted","Data":"c21eed60f286cfd5a323907b2b1a115472ab6a015543c6a9de62ce23d8371959"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.343624 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" event={"ID":"9b7c71bd-1fac-494e-8407-ecedfa667fc7","Type":"ContainerStarted","Data":"9f0ab9482848c5388313654750598d5293cec0ba6fb453b8f8291468b8c5f1bd"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 
00:25:51.360489 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.361743 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.861732032 +0000 UTC m=+142.407084898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.384734 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vz9ft" podStartSLOduration=123.384714161 podStartE2EDuration="2m3.384714161s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:51.384185736 +0000 UTC m=+141.929538602" watchObservedRunningTime="2025-12-12 00:25:51.384714161 +0000 UTC m=+141.930067037" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.395815 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-f6l62" event={"ID":"6fb93471-e75b-43b2-a4e2-d36bfc617930","Type":"ContainerStarted","Data":"17e7b0aacff5bc9f2bdd3a6585748a13341297a2ae062bdaf80d01b1c90f033b"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.446035 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" event={"ID":"848f49c7-d7b8-4490-9956-4014339c4a31","Type":"ContainerStarted","Data":"becfccd74e80dc6fc679b092a394c7657821b34436df35088f289d31b2aa605b"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.452326 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pvw8f" event={"ID":"b7cea121-b721-4fa7-9b48-057abf4048af","Type":"ContainerStarted","Data":"dd7fdd7e1e95cb9b284ce1c887fc133b439b6ecaa01a5b8011857831a597b88b"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.454706 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" event={"ID":"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9","Type":"ContainerStarted","Data":"4d3e10598e1cd34bcd29f7451d778d1386d52e1c525ebb0ba32aad70b87080d3"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.455961 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" event={"ID":"54e9b065-dbcb-4238-8da6-f36ae3e18dde","Type":"ContainerStarted","Data":"9010e71dfd0436a83db0c26f71915338f88354dd24a2f74bc5c4943149098179"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.464728 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:51 crc 
kubenswrapper[4606]: E1212 00:25:51.466535 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:51.966514816 +0000 UTC m=+142.511867682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.478627 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" event={"ID":"e829b744-8b79-4f65-8783-ea555f280ce8","Type":"ContainerStarted","Data":"09ebc03faa60dfcb6f923a2d75bc563b0142e6a469371c3e31ee4a01a4fdaf1b"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.491746 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" event={"ID":"ae2ce179-dd9a-4a2d-8f4c-e35424c12f94","Type":"ContainerStarted","Data":"738e9fc56c61e1b455a170de74b710e363252773eb7231f79eda9da3385f66d3"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.535754 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vlq68" event={"ID":"5635c63d-bd71-4b80-b111-0fd9ff2cd053","Type":"ContainerStarted","Data":"437b5bbbda746ca660662d581a945900fdeb0b8ce45c76da9a36cbebb7320422"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.536455 4606 patch_prober.go:28] interesting pod/downloads-7954f5f757-vlq68 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.536514 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vlq68" podUID="5635c63d-bd71-4b80-b111-0fd9ff2cd053" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.570831 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.571283 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.07126855 +0000 UTC m=+142.616621416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.578452 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" event={"ID":"d5b97b3b-8994-4d7f-a165-c04d13546e89","Type":"ContainerStarted","Data":"edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.579792 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:51 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:51 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:51 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.579867 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.580049 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.580109 4606 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-544fr container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.580134 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.596287 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4hwhh" podStartSLOduration=122.596271766 podStartE2EDuration="2m2.596271766s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:51.594670141 +0000 UTC m=+142.140023017" watchObservedRunningTime="2025-12-12 00:25:51.596271766 +0000 UTC m=+142.141624632" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.612387 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" event={"ID":"f6960858-8b4e-4855-b52a-caa021444b7d","Type":"ContainerStarted","Data":"b6bda5e9d6b3e30f009a5c3f4134a8990f06629f6d4c23b8fefd032f76302bce"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.648484 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" event={"ID":"e3e167ac-8428-49a3-899c-a2290fb8f78f","Type":"ContainerStarted","Data":"9f7b7555ec63eaee68e916f40b207f34386a70920809545bb08d432a0cf33d2d"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.664335 4606 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" podStartSLOduration=123.664317708 podStartE2EDuration="2m3.664317708s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:51.662071796 +0000 UTC m=+142.207424682" watchObservedRunningTime="2025-12-12 00:25:51.664317708 +0000 UTC m=+142.209670574" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.665783 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" event={"ID":"e836a0be-41d7-4d9d-9b59-d1db42826d1b","Type":"ContainerStarted","Data":"e47ea634a541555b4bde990e9475dec6c3d50e872839f3c785768604d50d39ad"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.672016 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.673339 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.173316708 +0000 UTC m=+142.718669574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.674872 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" event={"ID":"ef54a55c-b8cc-405b-a5a6-ff9118635c9d","Type":"ContainerStarted","Data":"6cbb2800a781118a59d6051b22655cc531017cc7330aaf0985e5350abaf1c419"} Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.764444 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-898cq" podStartSLOduration=8.764421193 podStartE2EDuration="8.764421193s" podCreationTimestamp="2025-12-12 00:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:51.756269726 +0000 UTC m=+142.301622602" watchObservedRunningTime="2025-12-12 00:25:51.764421193 +0000 UTC m=+142.309774059" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.781053 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.783862 4606 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.283847993 +0000 UTC m=+142.829200859 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.829668 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-46kcq" podStartSLOduration=122.829650737 podStartE2EDuration="2m2.829650737s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:51.829456952 +0000 UTC m=+142.374809808" watchObservedRunningTime="2025-12-12 00:25:51.829650737 +0000 UTC m=+142.375003623" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.882718 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.883051 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:52.383036602 +0000 UTC m=+142.928389468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.965940 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" podStartSLOduration=122.965923797 podStartE2EDuration="2m2.965923797s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:51.893736509 +0000 UTC m=+142.439089375" watchObservedRunningTime="2025-12-12 00:25:51.965923797 +0000 UTC m=+142.511276663" Dec 12 00:25:51 crc kubenswrapper[4606]: I1212 00:25:51.992608 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:51 crc kubenswrapper[4606]: E1212 00:25:51.992939 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.492924098 +0000 UTC m=+143.038276964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.094678 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.095208 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.595193953 +0000 UTC m=+143.140546819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.124721 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" podStartSLOduration=123.124705024 podStartE2EDuration="2m3.124705024s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.040396439 +0000 UTC m=+142.585749315" watchObservedRunningTime="2025-12-12 00:25:52.124705024 +0000 UTC m=+142.670057890" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.125128 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ffwdl" podStartSLOduration=123.125124305 podStartE2EDuration="2m3.125124305s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.124300753 +0000 UTC m=+142.669653619" watchObservedRunningTime="2025-12-12 00:25:52.125124305 +0000 UTC m=+142.670477171" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.172376 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" podStartSLOduration=123.172353658 podStartE2EDuration="2m3.172353658s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.169599042 +0000 UTC m=+142.714951908" watchObservedRunningTime="2025-12-12 00:25:52.172353658 +0000 UTC m=+142.717706524" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.199222 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.199581 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.699566945 +0000 UTC m=+143.244919811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.254978 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-l5pmq" podStartSLOduration=123.254962006 podStartE2EDuration="2m3.254962006s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.212505035 +0000 UTC m=+142.757857901" watchObservedRunningTime="2025-12-12 00:25:52.254962006 +0000 UTC m=+142.800314872" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.256877 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" podStartSLOduration=123.256870169 podStartE2EDuration="2m3.256870169s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.254555645 +0000 UTC m=+142.799908511" watchObservedRunningTime="2025-12-12 00:25:52.256870169 +0000 UTC m=+142.802223035" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.287276 4606 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8ltmf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.287330 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" podUID="e107c29e-3757-44e7-ad54-223801e19085" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.306618 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4fdkr" podStartSLOduration=123.306601822 podStartE2EDuration="2m3.306601822s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.304409661 +0000 UTC m=+142.849762527" watchObservedRunningTime="2025-12-12 00:25:52.306601822 +0000 UTC m=+142.851954688" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.306831 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.307229 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.807215149 +0000 UTC m=+143.352568015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.408750 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.409265 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:52.909254028 +0000 UTC m=+143.454606884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.410149 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" podStartSLOduration=123.410139322 podStartE2EDuration="2m3.410139322s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.405600776 +0000 UTC m=+142.950953642" watchObservedRunningTime="2025-12-12 00:25:52.410139322 +0000 UTC m=+142.955492188" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.448872 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" podStartSLOduration=123.448854619 podStartE2EDuration="2m3.448854619s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.446136053 +0000 UTC m=+142.991488919" watchObservedRunningTime="2025-12-12 00:25:52.448854619 +0000 UTC m=+142.994207485" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.481048 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" podStartSLOduration=123.481031654 podStartE2EDuration="2m3.481031654s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.480748416 +0000 UTC m=+143.026101292" watchObservedRunningTime="2025-12-12 00:25:52.481031654 +0000 UTC m=+143.026384520" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.513644 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.514083 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.014068903 +0000 UTC m=+143.559421769 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.548832 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pvw8f" podStartSLOduration=9.548814849 podStartE2EDuration="9.548814849s" podCreationTimestamp="2025-12-12 00:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.546373752 +0000 UTC m=+143.091726618" watchObservedRunningTime="2025-12-12 00:25:52.548814849 +0000 UTC m=+143.094167715" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.572367 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:52 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:52 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:52 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.572432 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.615213 4606 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.615557 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.115543845 +0000 UTC m=+143.660896711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.688029 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" event={"ID":"6fb93471-e75b-43b2-a4e2-d36bfc617930","Type":"ContainerStarted","Data":"cea8cdc876c88f517e9292fd3063721586f8cb10835f7f7f5acbea275622b879"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.692312 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" event={"ID":"24b34ccb-e494-493e-98a9-31cf59981c38","Type":"ContainerStarted","Data":"c840510bc618b15fd2306c20e976c501e94c26ebcbb08b680aa5eea5131f6737"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.716725 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.716952 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.216912695 +0000 UTC m=+143.762265561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.717706 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.718384 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.218211391 +0000 UTC m=+143.763564247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.719355 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbfr9" event={"ID":"e3e167ac-8428-49a3-899c-a2290fb8f78f","Type":"ContainerStarted","Data":"b7a2f96ec5bcf1bbda3b041c8e42b5bbe1fe86b00597f60b4e23c01d65a00067"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.732357 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" event={"ID":"5c84db9e-ba60-4723-95f8-887b16eaa84a","Type":"ContainerStarted","Data":"d136a132087b672d265d89e320a70ddfc9c713b8a072189687965c876184efda"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.732408 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" event={"ID":"5c84db9e-ba60-4723-95f8-887b16eaa84a","Type":"ContainerStarted","Data":"b51bf9a2f5bff3ee9c93d2955f327e408e425797d286415b59f0d8b330800b20"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.737027 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4p7d9" event={"ID":"848f49c7-d7b8-4490-9956-4014339c4a31","Type":"ContainerStarted","Data":"58bf66edb8a27ff739e4a359c1590a300face345efdeff1e41a5a5c226dbc03c"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.743746 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" event={"ID":"b993a003-2c7c-484b-a44b-17f07bdf6784","Type":"ContainerStarted","Data":"cb694be2fe097bfafba222ce53686269e08644739cb1e513a3768d01d303ce32"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.749789 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" event={"ID":"9b7c71bd-1fac-494e-8407-ecedfa667fc7","Type":"ContainerStarted","Data":"f3acce94c709bfd6e56ca940f2b563c56c62902741e670a2d5dd229d699dbb46"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.754384 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" event={"ID":"e836a0be-41d7-4d9d-9b59-d1db42826d1b","Type":"ContainerStarted","Data":"fad8d32a272573c3c2be844e844db1d539fd5dd80b574ccc1db7ede863b25d89"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.762480 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6h2gf" event={"ID":"e829b744-8b79-4f65-8783-ea555f280ce8","Type":"ContainerStarted","Data":"63240d775d2ee713bdf7e20ec29d6087ff76f27972ede9ce8f4c3c8ba3631c88"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.762702 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" podStartSLOduration=124.762685518 podStartE2EDuration="2m4.762685518s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.759386977 +0000 UTC m=+143.304739843" watchObservedRunningTime="2025-12-12 00:25:52.762685518 +0000 UTC m=+143.308038384" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.768059 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" event={"ID":"e9e0df34-ce0f-47c9-b74a-ef5e3a3f0d77","Type":"ContainerStarted","Data":"eebb4ae19be2f3c8ed02542eb911d66d6f8a39546ee56dafa3353f874878f47e"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.776717 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" event={"ID":"0ef842ad-7b0b-4e92-bfb2-23306b0f85f9","Type":"ContainerStarted","Data":"0be0ed408cdf8fcd21872b0d4d2dc1b3772c10c7df4878a2d604cc578fe6ab15"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.782443 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" event={"ID":"84d78ffd-976a-4d55-9b6a-d10369b35718","Type":"ContainerStarted","Data":"8b29dc03d75470996964505c89de11f8528511543dc8cac547ab217e640ee5da"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.782926 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.791471 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z6php" event={"ID":"073ca95c-7738-4217-99af-72c9335f0c30","Type":"ContainerStarted","Data":"4a29cb0f52dad59f36f2cec43b26615d57232bc45f62ac248033532a748cea55"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.791518 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z6php" event={"ID":"073ca95c-7738-4217-99af-72c9335f0c30","Type":"ContainerStarted","Data":"a27f5c31ae152e6e451c1fa7ff4b9cc8c1449af269f4eeebe0537911fb583e6d"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.791654 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z6php" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.794686 4606 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" event={"ID":"ef54a55c-b8cc-405b-a5a6-ff9118635c9d","Type":"ContainerStarted","Data":"7255c1869cffdb74a168db74ec09d0399d18fe68e209de63a9fc9bd6e9bf2546"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.795570 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.797226 4606 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-52vsx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.797260 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" podUID="ef54a55c-b8cc-405b-a5a6-ff9118635c9d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.799248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q79c6" event={"ID":"77014cd9-7c35-4271-b875-a1f2b2a712d1","Type":"ContainerStarted","Data":"81444eced8848cb3bc95e1b19a3d25d41c95bfe414a511333b9c3bf50fd87bf9"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.801462 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" event={"ID":"a76fe324-756f-48ce-8e78-9ee4b4933b96","Type":"ContainerStarted","Data":"afea8ac9001ef3c79071ecb53ac9e5109a039c417057968c8b77e6f9c8dc8e83"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.805017 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" event={"ID":"4dc94fa6-9193-4aff-94d5-3ab846094763","Type":"ContainerStarted","Data":"1a8cf7dfa5796cf46e323449b61631543e3f558e513a39e5fe87d7bbf05148eb"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.805165 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.805290 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" event={"ID":"4dc94fa6-9193-4aff-94d5-3ab846094763","Type":"ContainerStarted","Data":"5c3fdc7734bf51b51c523c3550d6c703968a7f6bdaf8566af4039bbb8bec776b"} Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.807959 4606 patch_prober.go:28] interesting pod/downloads-7954f5f757-vlq68 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.808003 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vlq68" podUID="5635c63d-bd71-4b80-b111-0fd9ff2cd053" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.808560 4606 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-544fr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.808587 4606 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.820089 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.820219 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.320196348 +0000 UTC m=+143.865549214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.820525 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.823404 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.323371716 +0000 UTC m=+143.868724582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.849814 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snpww" podStartSLOduration=123.849798731 podStartE2EDuration="2m3.849798731s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.819856788 +0000 UTC m=+143.365209664" watchObservedRunningTime="2025-12-12 00:25:52.849798731 +0000 UTC m=+143.395151597" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.901075 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" podStartSLOduration=124.901057587 podStartE2EDuration="2m4.901057587s" podCreationTimestamp="2025-12-12 00:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.85837317 +0000 UTC m=+143.403726036" watchObservedRunningTime="2025-12-12 00:25:52.901057587 +0000 UTC m=+143.446410453" Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.921527 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:53.421501386 +0000 UTC m=+143.966854252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.921549 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.921811 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:52 crc kubenswrapper[4606]: E1212 00:25:52.926620 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.426604298 +0000 UTC m=+143.971957164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.951882 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2drsc" podStartSLOduration=123.95186553 podStartE2EDuration="2m3.95186553s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.950900283 +0000 UTC m=+143.496253149" watchObservedRunningTime="2025-12-12 00:25:52.95186553 +0000 UTC m=+143.497218386" Dec 12 00:25:52 crc kubenswrapper[4606]: I1212 00:25:52.952214 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" podStartSLOduration=123.95221022 podStartE2EDuration="2m3.95221022s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:52.903784083 +0000 UTC m=+143.449136949" watchObservedRunningTime="2025-12-12 00:25:52.95221022 +0000 UTC m=+143.497563086" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.026000 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.026301 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.5262871 +0000 UTC m=+144.071639966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.048656 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" podStartSLOduration=124.048638992 podStartE2EDuration="2m4.048638992s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:53.034475338 +0000 UTC m=+143.579828204" watchObservedRunningTime="2025-12-12 00:25:53.048638992 +0000 UTC m=+143.593991858" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.049984 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-drkxp"] Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.050862 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.065825 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.113780 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-dmjzw" podStartSLOduration=124.113758803 podStartE2EDuration="2m4.113758803s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:53.096776031 +0000 UTC m=+143.642128897" watchObservedRunningTime="2025-12-12 00:25:53.113758803 +0000 UTC m=+143.659111659" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.114117 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drkxp"] Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.127839 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-catalog-content\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.127920 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-utilities\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.127950 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.127980 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5btch\" (UniqueName: \"kubernetes.io/projected/16aefbaf-f9b9-452e-84fa-a710b9284349-kube-api-access-5btch\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.128317 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.628305558 +0000 UTC m=+144.173658424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.160321 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hhzb9" podStartSLOduration=124.160305858 podStartE2EDuration="2m4.160305858s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:53.157617293 +0000 UTC m=+143.702970159" watchObservedRunningTime="2025-12-12 00:25:53.160305858 +0000 UTC m=+143.705658714" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.203144 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xg5pj"] Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.204026 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: W1212 00:25:53.212627 4606 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.212669 4606 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.228994 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.229257 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-catalog-content\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.229323 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-utilities\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.229383 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5btch\" (UniqueName: \"kubernetes.io/projected/16aefbaf-f9b9-452e-84fa-a710b9284349-kube-api-access-5btch\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.229761 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.729745579 +0000 UTC m=+144.275098445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.230101 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-catalog-content\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.230319 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-utilities\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.299350 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9vfqp" podStartSLOduration=124.299329525 podStartE2EDuration="2m4.299329525s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:53.231698344 +0000 UTC m=+143.777051200" watchObservedRunningTime="2025-12-12 00:25:53.299329525 +0000 UTC m=+143.844682391" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.300639 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xg5pj"] Dec 12 00:25:53 crc 
kubenswrapper[4606]: I1212 00:25:53.318107 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5btch\" (UniqueName: \"kubernetes.io/projected/16aefbaf-f9b9-452e-84fa-a710b9284349-kube-api-access-5btch\") pod \"certified-operators-drkxp\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.330494 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-catalog-content\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.330820 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drsn7\" (UniqueName: \"kubernetes.io/projected/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-kube-api-access-drsn7\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.330843 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-utilities\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.330869 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: 
\"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.331092 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.831082568 +0000 UTC m=+144.376435434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.365449 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" podStartSLOduration=124.365433884 podStartE2EDuration="2m4.365433884s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:53.36457052 +0000 UTC m=+143.909923386" watchObservedRunningTime="2025-12-12 00:25:53.365433884 +0000 UTC m=+143.910786740" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.367000 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z6php" podStartSLOduration=9.366990887 podStartE2EDuration="9.366990887s" podCreationTimestamp="2025-12-12 00:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:53.30129026 +0000 UTC 
m=+143.846643136" watchObservedRunningTime="2025-12-12 00:25:53.366990887 +0000 UTC m=+143.912343753" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.379926 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-krjvv"] Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.380114 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.402579 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.431933 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.432258 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drsn7\" (UniqueName: \"kubernetes.io/projected/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-kube-api-access-drsn7\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.432286 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-utilities\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.432321 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-catalog-content\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.432709 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-catalog-content\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.432780 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:53.932765516 +0000 UTC m=+144.478118382 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.433231 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-utilities\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.447940 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krjvv"] Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.466849 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drsn7\" (UniqueName: \"kubernetes.io/projected/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-kube-api-access-drsn7\") pod \"community-operators-xg5pj\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.533127 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.533193 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9gbkx\" (UniqueName: \"kubernetes.io/projected/4f43f314-8361-4374-87fc-00d8955b4ca4-kube-api-access-9gbkx\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.533230 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-utilities\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.533258 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-catalog-content\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.533548 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.033537139 +0000 UTC m=+144.578890005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.574928 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:53 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:53 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:53 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.574977 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.589767 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.612123 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sfvgg"] Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.612993 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.634845 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.634989 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbkx\" (UniqueName: \"kubernetes.io/projected/4f43f314-8361-4374-87fc-00d8955b4ca4-kube-api-access-9gbkx\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.635043 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-utilities\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.635069 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-catalog-content\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.635706 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-catalog-content\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") 
" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.635781 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.135767583 +0000 UTC m=+144.681120449 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.636196 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-utilities\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.696342 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfvgg"] Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.736877 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-catalog-content\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.736913 4606 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.737034 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-utilities\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.737063 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flcpx\" (UniqueName: \"kubernetes.io/projected/77eaeae2-bc76-4cb3-9578-f24186325c2c-kube-api-access-flcpx\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.737334 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.237324748 +0000 UTC m=+144.782677614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.794868 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gbkx\" (UniqueName: \"kubernetes.io/projected/4f43f314-8361-4374-87fc-00d8955b4ca4-kube-api-access-9gbkx\") pod \"certified-operators-krjvv\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.807347 4606 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8ltmf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.807402 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" podUID="e107c29e-3757-44e7-ad54-223801e19085" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.808401 4606 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-dgfmw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.808449 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" podUID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.837846 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.838286 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-catalog-content\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.838397 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-utilities\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.838418 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flcpx\" (UniqueName: 
\"kubernetes.io/projected/77eaeae2-bc76-4cb3-9578-f24186325c2c-kube-api-access-flcpx\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.838722 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.338707068 +0000 UTC m=+144.884059934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.839059 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-catalog-content\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.839268 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-utilities\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.840139 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" event={"ID":"e836a0be-41d7-4d9d-9b59-d1db42826d1b","Type":"ContainerStarted","Data":"3773efebae801b97d70d6270c7a06862ef236a18bcfd7461a555533bbdbba319"} Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.852301 4606 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-544fr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.852352 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.874487 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-52vsx" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.912851 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flcpx\" (UniqueName: \"kubernetes.io/projected/77eaeae2-bc76-4cb3-9578-f24186325c2c-kube-api-access-flcpx\") pod \"community-operators-sfvgg\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.928932 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:25:53 crc kubenswrapper[4606]: I1212 00:25:53.955876 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:53 crc kubenswrapper[4606]: E1212 00:25:53.965137 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.465124344 +0000 UTC m=+145.010477200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.038425 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.057666 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.057960 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:54.557944416 +0000 UTC m=+145.103297282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.160723 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.161041 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.661030703 +0000 UTC m=+145.206383569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.163934 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8ltmf" Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.266703 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.267010 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.766995541 +0000 UTC m=+145.312348407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.281544 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.288565 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.289320 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.301577 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drkxp"] Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.375012 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.375283 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:54.875271682 +0000 UTC m=+145.420624538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.486576 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.487038 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.98701698 +0000 UTC m=+145.532369846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.487223 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.487525 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:54.987511144 +0000 UTC m=+145.532864010 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.576190 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:54 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:54 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:54 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.576235 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.588891 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.589323 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:55.089306126 +0000 UTC m=+145.634658992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.690767 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.691055 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.191044166 +0000 UTC m=+145.736397032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.791730 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.792059 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.292034205 +0000 UTC m=+145.837387071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.792247 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.792496 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.292484257 +0000 UTC m=+145.837837123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.854671 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drkxp" event={"ID":"16aefbaf-f9b9-452e-84fa-a710b9284349","Type":"ContainerStarted","Data":"c97d7db2b7cfa6f202e7f21298fade29ee8280a38549795f518938dfa846a454"} Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.893535 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.893827 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.393802235 +0000 UTC m=+145.939155101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.894109 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.894466 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.394459303 +0000 UTC m=+145.939812159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.998652 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:54 crc kubenswrapper[4606]: E1212 00:25:54.998784 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.498759845 +0000 UTC m=+146.044112711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:54 crc kubenswrapper[4606]: I1212 00:25:54.999212 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.000235 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.500225215 +0000 UTC m=+146.045578081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.048620 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q5998"] Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.050900 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.055134 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.075590 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krjvv"] Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.100086 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.100388 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.600375491 +0000 UTC m=+146.145728357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.159348 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5998"] Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.201366 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jbj\" (UniqueName: \"kubernetes.io/projected/ee26d00d-079d-41c7-b641-5f2373eef2ee-kube-api-access-46jbj\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.201430 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-catalog-content\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.201510 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-utilities\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.201541 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.201801 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.701787782 +0000 UTC m=+146.247140638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.302956 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.303053 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.803037508 +0000 UTC m=+146.348390374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.303229 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-catalog-content\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.303626 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-catalog-content\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.303720 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-utilities\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.303956 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-utilities\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 
00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.303754 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.304013 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jbj\" (UniqueName: \"kubernetes.io/projected/ee26d00d-079d-41c7-b641-5f2373eef2ee-kube-api-access-46jbj\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.304257 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.804250072 +0000 UTC m=+146.349602938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.308227 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.308348 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.332432 4606 patch_prober.go:28] interesting pod/apiserver-76f77b778f-f6l62 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]log ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]etcd ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/generic-apiserver-start-informers ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/max-in-flight-filter ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 12 00:25:55 crc kubenswrapper[4606]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 12 00:25:55 crc kubenswrapper[4606]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 12 00:25:55 crc 
kubenswrapper[4606]: [+]poststarthook/project.openshift.io-projectcache ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/openshift.io-startinformers ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 12 00:25:55 crc kubenswrapper[4606]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 12 00:25:55 crc kubenswrapper[4606]: livez check failed Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.332470 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" podUID="6fb93471-e75b-43b2-a4e2-d36bfc617930" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.339100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jbj\" (UniqueName: \"kubernetes.io/projected/ee26d00d-079d-41c7-b641-5f2373eef2ee-kube-api-access-46jbj\") pod \"redhat-marketplace-q5998\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.368511 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rkpr5"] Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.369530 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.382511 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkpr5"] Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.404439 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.409119 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.410098 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:55.910082916 +0000 UTC m=+146.455435772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.510532 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-utilities\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.510824 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.510845 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-catalog-content\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.510893 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qd8z\" (UniqueName: \"kubernetes.io/projected/18996b4c-ea24-4fa6-8420-c6ff4cd30473-kube-api-access-6qd8z\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.511135 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.011123216 +0000 UTC m=+146.556476082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.593772 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:55 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:55 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:55 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.593825 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.611766 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.612156 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-utilities\") pod \"redhat-marketplace-rkpr5\" (UID: 
\"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.612237 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-catalog-content\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.612291 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qd8z\" (UniqueName: \"kubernetes.io/projected/18996b4c-ea24-4fa6-8420-c6ff4cd30473-kube-api-access-6qd8z\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.612693 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.112678591 +0000 UTC m=+146.658031457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.613482 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-utilities\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.613514 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-catalog-content\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.657115 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qd8z\" (UniqueName: \"kubernetes.io/projected/18996b4c-ea24-4fa6-8420-c6ff4cd30473-kube-api-access-6qd8z\") pod \"redhat-marketplace-rkpr5\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.706708 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.714159 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.714473 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.214460992 +0000 UTC m=+146.759813848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.722199 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfvgg"] Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.817045 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.817241 4606 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.317215279 +0000 UTC m=+146.862568145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.817579 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.817941 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.317928969 +0000 UTC m=+146.863281825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.855762 4606 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-frwjc container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.855834 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" podUID="84d78ffd-976a-4d55-9b6a-d10369b35718" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.865810 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfvgg" event={"ID":"77eaeae2-bc76-4cb3-9578-f24186325c2c","Type":"ContainerStarted","Data":"e1f7dc97b6cb6f849880022d071b0eb131b2ac93c97caa68eadee8da85b33e49"} Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.869562 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" event={"ID":"e836a0be-41d7-4d9d-9b59-d1db42826d1b","Type":"ContainerStarted","Data":"6bc7b725b5c0b76a13b01180cb3b3b092aa2aea7d543076035e73226715f11c1"} Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.870966 4606 
generic.go:334] "Generic (PLEG): container finished" podID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerID="406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa" exitCode=0 Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.871137 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drkxp" event={"ID":"16aefbaf-f9b9-452e-84fa-a710b9284349","Type":"ContainerDied","Data":"406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa"} Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.872617 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.873694 4606 generic.go:334] "Generic (PLEG): container finished" podID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerID="8ce8e512c3248ab076b72b38d89594667065c64b0d80813ee37cfe4278fa1bb6" exitCode=0 Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.873895 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krjvv" event={"ID":"4f43f314-8361-4374-87fc-00d8955b4ca4","Type":"ContainerDied","Data":"8ce8e512c3248ab076b72b38d89594667065c64b0d80813ee37cfe4278fa1bb6"} Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.873938 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krjvv" event={"ID":"4f43f314-8361-4374-87fc-00d8955b4ca4","Type":"ContainerStarted","Data":"5dd8bbcbb7a5c1404fd88b5e9fd7ea24dae04ec828987fbbb5ffd11ec665b057"} Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.919336 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:55 crc 
kubenswrapper[4606]: E1212 00:25:55.919726 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.419697449 +0000 UTC m=+146.965050315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:55 crc kubenswrapper[4606]: I1212 00:25:55.923264 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:55 crc kubenswrapper[4606]: E1212 00:25:55.923579 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.423562767 +0000 UTC m=+146.968915633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.007025 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xg5pj"] Dec 12 00:25:56 crc kubenswrapper[4606]: W1212 00:25:56.020282 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b189fb_cbc9_4ebe_bfa2_b8a4e1e0d4b4.slice/crio-ead8403e15c823bdec4438306689ab40fb499d708c8b11597eaa155b8bdc63dc WatchSource:0}: Error finding container ead8403e15c823bdec4438306689ab40fb499d708c8b11597eaa155b8bdc63dc: Status 404 returned error can't find the container with id ead8403e15c823bdec4438306689ab40fb499d708c8b11597eaa155b8bdc63dc Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.027935 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.028270 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.528256079 +0000 UTC m=+147.073608945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.084856 4606 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.129939 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.130256 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.630244516 +0000 UTC m=+147.175597382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.199972 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5998"] Dec 12 00:25:56 crc kubenswrapper[4606]: W1212 00:25:56.218308 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee26d00d_079d_41c7_b641_5f2373eef2ee.slice/crio-1dc5d89429b0743211d02a00386457a161b9ebf9a4bc8d8efd00b3c28c289ad6 WatchSource:0}: Error finding container 1dc5d89429b0743211d02a00386457a161b9ebf9a4bc8d8efd00b3c28c289ad6: Status 404 returned error can't find the container with id 1dc5d89429b0743211d02a00386457a161b9ebf9a4bc8d8efd00b3c28c289ad6 Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.230656 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.231149 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.731134962 +0000 UTC m=+147.276487828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.238397 4606 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-frwjc container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.240261 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" podUID="84d78ffd-976a-4d55-9b6a-d10369b35718" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.332044 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.332413 4606 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.832402709 +0000 UTC m=+147.377755565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.346579 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-frwjc" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.363125 4606 patch_prober.go:28] interesting pod/downloads-7954f5f757-vlq68 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.363166 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vlq68" podUID="5635c63d-bd71-4b80-b111-0fd9ff2cd053" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.363221 4606 patch_prober.go:28] interesting pod/downloads-7954f5f757-vlq68 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 
00:25:56.363268 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vlq68" podUID="5635c63d-bd71-4b80-b111-0fd9ff2cd053" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.399035 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fqkw9"] Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.400116 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.415677 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.415705 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.417811 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqkw9"] Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.425909 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.435243 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.435641 4606 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:56.93561089 +0000 UTC m=+147.480963756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.435685 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-catalog-content\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.435738 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54mqq\" (UniqueName: \"kubernetes.io/projected/303940e6-1922-4197-ad2a-6524c192b1b5-kube-api-access-54mqq\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.435806 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-utilities\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc 
kubenswrapper[4606]: I1212 00:25:56.443475 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.519313 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.519638 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.524685 4606 patch_prober.go:28] interesting pod/console-f9d7485db-dlrwh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.524729 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dlrwh" podUID="c454b7c4-18db-442a-ae25-d66e7e6061f3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.537560 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.537858 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-catalog-content\") pod \"redhat-operators-fqkw9\" (UID: 
\"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.537947 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54mqq\" (UniqueName: \"kubernetes.io/projected/303940e6-1922-4197-ad2a-6524c192b1b5-kube-api-access-54mqq\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.538069 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-utilities\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.538506 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-utilities\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.538776 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-catalog-content\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.539622 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:57.039600892 +0000 UTC m=+147.584953808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.565698 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.566586 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54mqq\" (UniqueName: \"kubernetes.io/projected/303940e6-1922-4197-ad2a-6524c192b1b5-kube-api-access-54mqq\") pod \"redhat-operators-fqkw9\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.573121 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:56 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:56 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:56 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.573230 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.638891 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.639556 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-12 00:25:57.139529962 +0000 UTC m=+147.684882828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.645828 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.653730 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkpr5"] Dec 12 00:25:56 crc kubenswrapper[4606]: W1212 00:25:56.665700 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18996b4c_ea24_4fa6_8420_c6ff4cd30473.slice/crio-9f205c4a3d93d161523dc2cfd7161c1bdc8e7624d44a7da82fbfc1c2b07e4114 WatchSource:0}: Error finding container 9f205c4a3d93d161523dc2cfd7161c1bdc8e7624d44a7da82fbfc1c2b07e4114: Status 404 returned error can't find the container with id 
9f205c4a3d93d161523dc2cfd7161c1bdc8e7624d44a7da82fbfc1c2b07e4114 Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.740075 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.740350 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-12 00:25:57.240339046 +0000 UTC m=+147.785691912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-t8nn7" (UID: "40292f84-e865-4368-9e37-e385dfcb5880") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.758059 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.765079 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kwr58"] Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.766257 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.780092 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwr58"] Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.840829 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.840934 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.840967 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9dp\" (UniqueName: \"kubernetes.io/projected/828a9e1a-5485-4706-abd6-fb28b99d0f19-kube-api-access-fz9dp\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.840998 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 
00:25:56.841017 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.841041 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.841082 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-catalog-content\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.841103 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-utilities\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: E1212 00:25:56.841224 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-12 00:25:57.341211231 +0000 UTC m=+147.886564097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.847286 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.847389 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.847472 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.856872 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.866754 4606 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-12T00:25:56.084883064Z","Handler":null,"Name":""} Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.889471 4606 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.889526 4606 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.895208 4606 generic.go:334] "Generic (PLEG): container finished" podID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerID="ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87" exitCode=0 Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.895268 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg5pj" event={"ID":"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4","Type":"ContainerDied","Data":"ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87"} Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.895293 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg5pj" event={"ID":"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4","Type":"ContainerStarted","Data":"ead8403e15c823bdec4438306689ab40fb499d708c8b11597eaa155b8bdc63dc"} Dec 12 
00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.898728 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkpr5" event={"ID":"18996b4c-ea24-4fa6-8420-c6ff4cd30473","Type":"ContainerStarted","Data":"f91a1ee60687b4a608b25ae1ef9c058539f792ebe7c9d9ddf3954bb2ad9e6011"} Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.898760 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkpr5" event={"ID":"18996b4c-ea24-4fa6-8420-c6ff4cd30473","Type":"ContainerStarted","Data":"9f205c4a3d93d161523dc2cfd7161c1bdc8e7624d44a7da82fbfc1c2b07e4114"} Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.923647 4606 generic.go:334] "Generic (PLEG): container finished" podID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerID="db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f" exitCode=0 Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.923756 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5998" event={"ID":"ee26d00d-079d-41c7-b641-5f2373eef2ee","Type":"ContainerDied","Data":"db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f"} Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.923780 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5998" event={"ID":"ee26d00d-079d-41c7-b641-5f2373eef2ee","Type":"ContainerStarted","Data":"1dc5d89429b0743211d02a00386457a161b9ebf9a4bc8d8efd00b3c28c289ad6"} Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.927622 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.942356 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.942391 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-catalog-content\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.942411 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-utilities\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.942492 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9dp\" (UniqueName: \"kubernetes.io/projected/828a9e1a-5485-4706-abd6-fb28b99d0f19-kube-api-access-fz9dp\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.943548 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-catalog-content\") pod 
\"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.943759 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-utilities\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.951033 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.951683 4606 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.951712 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.967386 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.976195 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9dp\" (UniqueName: \"kubernetes.io/projected/828a9e1a-5485-4706-abd6-fb28b99d0f19-kube-api-access-fz9dp\") pod \"redhat-operators-kwr58\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:56 crc kubenswrapper[4606]: I1212 00:25:56.991377 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" event={"ID":"e836a0be-41d7-4d9d-9b59-d1db42826d1b","Type":"ContainerStarted","Data":"64ca3d3791b743055a2860266a95861af8be57614d8bd432cda0ed8e54f26604"} Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.038768 4606 generic.go:334] "Generic (PLEG): container finished" podID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerID="dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6" exitCode=0 Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.039690 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfvgg" event={"ID":"77eaeae2-bc76-4cb3-9578-f24186325c2c","Type":"ContainerDied","Data":"dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6"} Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.051016 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-h8tjq" podStartSLOduration=14.050985156 podStartE2EDuration="14.050985156s" podCreationTimestamp="2025-12-12 00:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:57.02055428 +0000 UTC m=+147.565907146" watchObservedRunningTime="2025-12-12 00:25:57.050985156 +0000 UTC m=+147.596338022" Dec 12 00:25:57 crc kubenswrapper[4606]: 
I1212 00:25:57.076430 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n9mcs" Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.110718 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-t8nn7\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.112484 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.149925 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.198671 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqkw9"] Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.215154 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.387063 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:57 crc kubenswrapper[4606]: W1212 00:25:57.532814 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-79f742b368f4869261868bbff0c25aaa50644dd2c84bd4e83fb973bb1cfe0a1e WatchSource:0}: Error finding container 79f742b368f4869261868bbff0c25aaa50644dd2c84bd4e83fb973bb1cfe0a1e: Status 404 returned error can't find the container with id 79f742b368f4869261868bbff0c25aaa50644dd2c84bd4e83fb973bb1cfe0a1e Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.573781 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:57 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:57 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:57 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.573843 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.748581 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 12 00:25:57 crc kubenswrapper[4606]: I1212 00:25:57.798277 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwr58"] Dec 12 00:25:57 crc kubenswrapper[4606]: W1212 00:25:57.874462 4606 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828a9e1a_5485_4706_abd6_fb28b99d0f19.slice/crio-9ba02e65463f1dcb01d77353b8da73cbaf6d01aff7ddecd1338ce38a5ada5e54 WatchSource:0}: Error finding container 9ba02e65463f1dcb01d77353b8da73cbaf6d01aff7ddecd1338ce38a5ada5e54: Status 404 returned error can't find the container with id 9ba02e65463f1dcb01d77353b8da73cbaf6d01aff7ddecd1338ce38a5ada5e54 Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.087925 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cdebdf53f584c9b20857eb6c0e55c3a99e6cb6afafa1321cf8cfe73f2d4c5b6b"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.093340 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8nn7"] Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.101256 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"172cf551489ffdfeb29dd9508f27139baeaf38b68862b6492607bd9094b1c3e6"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.101316 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"79f742b368f4869261868bbff0c25aaa50644dd2c84bd4e83fb973bb1cfe0a1e"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.102080 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:25:58 crc kubenswrapper[4606]: W1212 00:25:58.117714 4606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40292f84_e865_4368_9e37_e385dfcb5880.slice/crio-c6a578130ba0136088789483edb93741101af32e0fd3f3096c2d4da613ec9421 WatchSource:0}: Error finding container c6a578130ba0136088789483edb93741101af32e0fd3f3096c2d4da613ec9421: Status 404 returned error can't find the container with id c6a578130ba0136088789483edb93741101af32e0fd3f3096c2d4da613ec9421 Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.117913 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6381da3fe466f793b1c5c2a5cb090548d407bb33af9ed347f6d73ba1559d71db"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.202982 4606 generic.go:334] "Generic (PLEG): container finished" podID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerID="f91a1ee60687b4a608b25ae1ef9c058539f792ebe7c9d9ddf3954bb2ad9e6011" exitCode=0 Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.203155 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkpr5" event={"ID":"18996b4c-ea24-4fa6-8420-c6ff4cd30473","Type":"ContainerDied","Data":"f91a1ee60687b4a608b25ae1ef9c058539f792ebe7c9d9ddf3954bb2ad9e6011"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.219952 4606 generic.go:334] "Generic (PLEG): container finished" podID="303940e6-1922-4197-ad2a-6524c192b1b5" containerID="8d142d962b8d94334be0e3cb53e7c755953d22ba5b3e6f7ec56c694763725e87" exitCode=0 Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.220021 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkw9" event={"ID":"303940e6-1922-4197-ad2a-6524c192b1b5","Type":"ContainerDied","Data":"8d142d962b8d94334be0e3cb53e7c755953d22ba5b3e6f7ec56c694763725e87"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.220046 4606 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkw9" event={"ID":"303940e6-1922-4197-ad2a-6524c192b1b5","Type":"ContainerStarted","Data":"eb51762e86cc272087d3623bc290edd5f5e73da812a29ff16042047144b4c8c5"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.231626 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwr58" event={"ID":"828a9e1a-5485-4706-abd6-fb28b99d0f19","Type":"ContainerStarted","Data":"9ba02e65463f1dcb01d77353b8da73cbaf6d01aff7ddecd1338ce38a5ada5e54"} Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.573818 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:58 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:58 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:58 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:58 crc kubenswrapper[4606]: I1212 00:25:58.573868 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.007027 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.008005 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.015103 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.025445 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.029360 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.082012 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22692cd1-90b3-467c-b11b-03f029f9060e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.082097 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22692cd1-90b3-467c-b11b-03f029f9060e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.184040 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22692cd1-90b3-467c-b11b-03f029f9060e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.184545 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22692cd1-90b3-467c-b11b-03f029f9060e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.184604 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22692cd1-90b3-467c-b11b-03f029f9060e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.222806 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22692cd1-90b3-467c-b11b-03f029f9060e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.244389 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1295666ccbd0677aa399db199e85a9aa36928593a5285da90a7c87c3669a181c"} Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.250892 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" event={"ID":"40292f84-e865-4368-9e37-e385dfcb5880","Type":"ContainerStarted","Data":"d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31"} Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.250953 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" 
event={"ID":"40292f84-e865-4368-9e37-e385dfcb5880","Type":"ContainerStarted","Data":"c6a578130ba0136088789483edb93741101af32e0fd3f3096c2d4da613ec9421"} Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.251836 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.257525 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ecc55dde9c319d01bee8e00af0bed087b10b43fc588c37d729f8c754212769ed"} Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.276236 4606 generic.go:334] "Generic (PLEG): container finished" podID="9b7c71bd-1fac-494e-8407-ecedfa667fc7" containerID="f3acce94c709bfd6e56ca940f2b563c56c62902741e670a2d5dd229d699dbb46" exitCode=0 Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.276321 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" event={"ID":"9b7c71bd-1fac-494e-8407-ecedfa667fc7","Type":"ContainerDied","Data":"f3acce94c709bfd6e56ca940f2b563c56c62902741e670a2d5dd229d699dbb46"} Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.293461 4606 generic.go:334] "Generic (PLEG): container finished" podID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerID="3a7c46ffe8ff62eb781eb16a95ab202bf645b84e0c45dfd47f389581830aa219" exitCode=0 Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.293650 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwr58" event={"ID":"828a9e1a-5485-4706-abd6-fb28b99d0f19","Type":"ContainerDied","Data":"3a7c46ffe8ff62eb781eb16a95ab202bf645b84e0c45dfd47f389581830aa219"} Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.321700 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" podStartSLOduration=130.321670814 podStartE2EDuration="2m10.321670814s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:25:59.316506221 +0000 UTC m=+149.861859087" watchObservedRunningTime="2025-12-12 00:25:59.321670814 +0000 UTC m=+149.867023680" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.340341 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.619649 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:25:59 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:25:59 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:25:59 crc kubenswrapper[4606]: healthz check failed Dec 12 00:25:59 crc kubenswrapper[4606]: I1212 00:25:59.619983 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.187007 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.327001 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.336972 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-f6l62" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.570274 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:26:00 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:26:00 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:26:00 crc kubenswrapper[4606]: healthz check failed Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.570650 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.832438 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.843971 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7c71bd-1fac-494e-8407-ecedfa667fc7-config-volume\") pod \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.844057 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d6pm\" (UniqueName: \"kubernetes.io/projected/9b7c71bd-1fac-494e-8407-ecedfa667fc7-kube-api-access-6d6pm\") pod \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.844087 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7c71bd-1fac-494e-8407-ecedfa667fc7-secret-volume\") pod \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\" (UID: \"9b7c71bd-1fac-494e-8407-ecedfa667fc7\") " Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.849645 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b7c71bd-1fac-494e-8407-ecedfa667fc7-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b7c71bd-1fac-494e-8407-ecedfa667fc7" (UID: "9b7c71bd-1fac-494e-8407-ecedfa667fc7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.855056 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7c71bd-1fac-494e-8407-ecedfa667fc7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b7c71bd-1fac-494e-8407-ecedfa667fc7" (UID: "9b7c71bd-1fac-494e-8407-ecedfa667fc7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.856645 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7c71bd-1fac-494e-8407-ecedfa667fc7-kube-api-access-6d6pm" (OuterVolumeSpecName: "kube-api-access-6d6pm") pod "9b7c71bd-1fac-494e-8407-ecedfa667fc7" (UID: "9b7c71bd-1fac-494e-8407-ecedfa667fc7"). InnerVolumeSpecName "kube-api-access-6d6pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.945018 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b7c71bd-1fac-494e-8407-ecedfa667fc7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.945058 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d6pm\" (UniqueName: \"kubernetes.io/projected/9b7c71bd-1fac-494e-8407-ecedfa667fc7-kube-api-access-6d6pm\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:00 crc kubenswrapper[4606]: I1212 00:26:00.945073 4606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b7c71bd-1fac-494e-8407-ecedfa667fc7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:01 crc kubenswrapper[4606]: I1212 00:26:01.335786 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" event={"ID":"9b7c71bd-1fac-494e-8407-ecedfa667fc7","Type":"ContainerDied","Data":"9f0ab9482848c5388313654750598d5293cec0ba6fb453b8f8291468b8c5f1bd"} Dec 12 00:26:01 crc kubenswrapper[4606]: I1212 00:26:01.335846 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f0ab9482848c5388313654750598d5293cec0ba6fb453b8f8291468b8c5f1bd" Dec 12 00:26:01 crc kubenswrapper[4606]: I1212 00:26:01.335961 4606 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67" Dec 12 00:26:01 crc kubenswrapper[4606]: I1212 00:26:01.368175 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"22692cd1-90b3-467c-b11b-03f029f9060e","Type":"ContainerStarted","Data":"af1884fc186a6041b732f518edd811ddc5353ce5b093a6b2cc12515c4a423783"} Dec 12 00:26:01 crc kubenswrapper[4606]: I1212 00:26:01.572654 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:26:01 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:26:01 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:26:01 crc kubenswrapper[4606]: healthz check failed Dec 12 00:26:01 crc kubenswrapper[4606]: I1212 00:26:01.572714 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.011646 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.011707 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.212545 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z6php" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.390951 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"22692cd1-90b3-467c-b11b-03f029f9060e","Type":"ContainerStarted","Data":"b9abdd3f2d318ce137bc287d91e1b10caed3120cd9bc041c649ac8a27b2f04af"} Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.524713 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 12 00:26:02 crc kubenswrapper[4606]: E1212 00:26:02.525023 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7c71bd-1fac-494e-8407-ecedfa667fc7" containerName="collect-profiles" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.525038 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7c71bd-1fac-494e-8407-ecedfa667fc7" containerName="collect-profiles" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.525139 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7c71bd-1fac-494e-8407-ecedfa667fc7" containerName="collect-profiles" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.525517 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.549563 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.550440 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.550697 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.570631 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:26:02 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:26:02 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:26:02 crc kubenswrapper[4606]: healthz check failed Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.570710 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.664825 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb2ff433-bebc-4600-9c55-32549e52559c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.664890 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb2ff433-bebc-4600-9c55-32549e52559c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.766985 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb2ff433-bebc-4600-9c55-32549e52559c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.767074 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb2ff433-bebc-4600-9c55-32549e52559c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.767279 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb2ff433-bebc-4600-9c55-32549e52559c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.808252 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb2ff433-bebc-4600-9c55-32549e52559c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:02 crc kubenswrapper[4606]: I1212 00:26:02.890255 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:03 crc kubenswrapper[4606]: I1212 00:26:03.454467 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=5.454450856 podStartE2EDuration="5.454450856s" podCreationTimestamp="2025-12-12 00:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:26:03.45170168 +0000 UTC m=+153.997054546" watchObservedRunningTime="2025-12-12 00:26:03.454450856 +0000 UTC m=+153.999803722" Dec 12 00:26:03 crc kubenswrapper[4606]: I1212 00:26:03.572279 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 12 00:26:03 crc kubenswrapper[4606]: I1212 00:26:03.575712 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:26:03 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:26:03 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:26:03 crc kubenswrapper[4606]: healthz check failed Dec 12 00:26:03 crc kubenswrapper[4606]: I1212 00:26:03.575762 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:04 crc kubenswrapper[4606]: I1212 00:26:04.451927 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb2ff433-bebc-4600-9c55-32549e52559c","Type":"ContainerStarted","Data":"1562087a4f9d8374ad352445487ff715c7ea7dd2134897e80986167db682327c"} Dec 12 
00:26:04 crc kubenswrapper[4606]: I1212 00:26:04.462771 4606 generic.go:334] "Generic (PLEG): container finished" podID="22692cd1-90b3-467c-b11b-03f029f9060e" containerID="b9abdd3f2d318ce137bc287d91e1b10caed3120cd9bc041c649ac8a27b2f04af" exitCode=0 Dec 12 00:26:04 crc kubenswrapper[4606]: I1212 00:26:04.462818 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"22692cd1-90b3-467c-b11b-03f029f9060e","Type":"ContainerDied","Data":"b9abdd3f2d318ce137bc287d91e1b10caed3120cd9bc041c649ac8a27b2f04af"} Dec 12 00:26:04 crc kubenswrapper[4606]: I1212 00:26:04.569438 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:26:04 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:26:04 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:26:04 crc kubenswrapper[4606]: healthz check failed Dec 12 00:26:04 crc kubenswrapper[4606]: I1212 00:26:04.569516 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.481361 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb2ff433-bebc-4600-9c55-32549e52559c","Type":"ContainerStarted","Data":"2dd6cf8ef737c9ed956e3f635b7c4ce573873480461e63de1abddb0f2c620aa8"} Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.512580 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.5125234020000002 podStartE2EDuration="3.512523402s" 
podCreationTimestamp="2025-12-12 00:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:26:05.502859343 +0000 UTC m=+156.048212239" watchObservedRunningTime="2025-12-12 00:26:05.512523402 +0000 UTC m=+156.057876268" Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.568502 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:26:05 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:26:05 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:26:05 crc kubenswrapper[4606]: healthz check failed Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.568552 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.866125 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.970005 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22692cd1-90b3-467c-b11b-03f029f9060e-kube-api-access\") pod \"22692cd1-90b3-467c-b11b-03f029f9060e\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.970129 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22692cd1-90b3-467c-b11b-03f029f9060e-kubelet-dir\") pod \"22692cd1-90b3-467c-b11b-03f029f9060e\" (UID: \"22692cd1-90b3-467c-b11b-03f029f9060e\") " Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.970230 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22692cd1-90b3-467c-b11b-03f029f9060e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "22692cd1-90b3-467c-b11b-03f029f9060e" (UID: "22692cd1-90b3-467c-b11b-03f029f9060e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.970571 4606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22692cd1-90b3-467c-b11b-03f029f9060e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:05 crc kubenswrapper[4606]: I1212 00:26:05.994232 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22692cd1-90b3-467c-b11b-03f029f9060e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "22692cd1-90b3-467c-b11b-03f029f9060e" (UID: "22692cd1-90b3-467c-b11b-03f029f9060e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.071814 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22692cd1-90b3-467c-b11b-03f029f9060e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.363267 4606 patch_prober.go:28] interesting pod/downloads-7954f5f757-vlq68 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.363592 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vlq68" podUID="5635c63d-bd71-4b80-b111-0fd9ff2cd053" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.364246 4606 patch_prober.go:28] interesting pod/downloads-7954f5f757-vlq68 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.364294 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vlq68" podUID="5635c63d-bd71-4b80-b111-0fd9ff2cd053" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.519824 4606 patch_prober.go:28] interesting pod/console-f9d7485db-dlrwh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 
10.217.0.13:8443: connect: connection refused" start-of-body= Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.519869 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dlrwh" podUID="c454b7c4-18db-442a-ae25-d66e7e6061f3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.538551 4606 generic.go:334] "Generic (PLEG): container finished" podID="eb2ff433-bebc-4600-9c55-32549e52559c" containerID="2dd6cf8ef737c9ed956e3f635b7c4ce573873480461e63de1abddb0f2c620aa8" exitCode=0 Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.538847 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb2ff433-bebc-4600-9c55-32549e52559c","Type":"ContainerDied","Data":"2dd6cf8ef737c9ed956e3f635b7c4ce573873480461e63de1abddb0f2c620aa8"} Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.568133 4606 patch_prober.go:28] interesting pod/router-default-5444994796-64s9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 12 00:26:06 crc kubenswrapper[4606]: [-]has-synced failed: reason withheld Dec 12 00:26:06 crc kubenswrapper[4606]: [+]process-running ok Dec 12 00:26:06 crc kubenswrapper[4606]: healthz check failed Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.568202 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-64s9l" podUID="21420052-2e90-4be9-923e-2b8d0d5ad189" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.576353 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"22692cd1-90b3-467c-b11b-03f029f9060e","Type":"ContainerDied","Data":"af1884fc186a6041b732f518edd811ddc5353ce5b093a6b2cc12515c4a423783"} Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.576393 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af1884fc186a6041b732f518edd811ddc5353ce5b093a6b2cc12515c4a423783" Dec 12 00:26:06 crc kubenswrapper[4606]: I1212 00:26:06.576589 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 12 00:26:07 crc kubenswrapper[4606]: I1212 00:26:07.568311 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:26:07 crc kubenswrapper[4606]: I1212 00:26:07.571573 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-64s9l" Dec 12 00:26:07 crc kubenswrapper[4606]: I1212 00:26:07.949910 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.116818 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb2ff433-bebc-4600-9c55-32549e52559c-kube-api-access\") pod \"eb2ff433-bebc-4600-9c55-32549e52559c\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.116869 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb2ff433-bebc-4600-9c55-32549e52559c-kubelet-dir\") pod \"eb2ff433-bebc-4600-9c55-32549e52559c\" (UID: \"eb2ff433-bebc-4600-9c55-32549e52559c\") " Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.117105 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb2ff433-bebc-4600-9c55-32549e52559c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eb2ff433-bebc-4600-9c55-32549e52559c" (UID: "eb2ff433-bebc-4600-9c55-32549e52559c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.130648 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2ff433-bebc-4600-9c55-32549e52559c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eb2ff433-bebc-4600-9c55-32549e52559c" (UID: "eb2ff433-bebc-4600-9c55-32549e52559c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.218021 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb2ff433-bebc-4600-9c55-32549e52559c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.218078 4606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb2ff433-bebc-4600-9c55-32549e52559c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.603853 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.603766 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb2ff433-bebc-4600-9c55-32549e52559c","Type":"ContainerDied","Data":"1562087a4f9d8374ad352445487ff715c7ea7dd2134897e80986167db682327c"} Dec 12 00:26:08 crc kubenswrapper[4606]: I1212 00:26:08.610293 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1562087a4f9d8374ad352445487ff715c7ea7dd2134897e80986167db682327c" Dec 12 00:26:11 crc kubenswrapper[4606]: I1212 00:26:11.568226 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" (UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:26:11 crc kubenswrapper[4606]: I1212 00:26:11.578141 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0853dce1-c009-407e-960d-1113f85e503f-metrics-certs\") pod \"network-metrics-daemon-mjjwd\" 
(UID: \"0853dce1-c009-407e-960d-1113f85e503f\") " pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:26:11 crc kubenswrapper[4606]: I1212 00:26:11.639427 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mjjwd" Dec 12 00:26:16 crc kubenswrapper[4606]: I1212 00:26:16.367529 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vlq68" Dec 12 00:26:16 crc kubenswrapper[4606]: I1212 00:26:16.723505 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:26:16 crc kubenswrapper[4606]: I1212 00:26:16.731783 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:26:17 crc kubenswrapper[4606]: I1212 00:26:17.394667 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:26:24 crc kubenswrapper[4606]: I1212 00:26:24.721898 4606 generic.go:334] "Generic (PLEG): container finished" podID="148f1f7a-b994-4984-a900-18e9d5868002" containerID="f0f374be38c7614b20aa284d3ad950f1cb7ccd72c89a2c8181f7c2ab0c3b634d" exitCode=0 Dec 12 00:26:24 crc kubenswrapper[4606]: I1212 00:26:24.722226 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29424960-bh2l7" event={"ID":"148f1f7a-b994-4984-a900-18e9d5868002","Type":"ContainerDied","Data":"f0f374be38c7614b20aa284d3ad950f1cb7ccd72c89a2c8181f7c2ab0c3b634d"} Dec 12 00:26:27 crc kubenswrapper[4606]: I1212 00:26:27.270484 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-c86kb" Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.502660 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.540209 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcl2w\" (UniqueName: \"kubernetes.io/projected/148f1f7a-b994-4984-a900-18e9d5868002-kube-api-access-fcl2w\") pod \"148f1f7a-b994-4984-a900-18e9d5868002\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.540616 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f1f7a-b994-4984-a900-18e9d5868002-serviceca\") pod \"148f1f7a-b994-4984-a900-18e9d5868002\" (UID: \"148f1f7a-b994-4984-a900-18e9d5868002\") " Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.542104 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148f1f7a-b994-4984-a900-18e9d5868002-serviceca" (OuterVolumeSpecName: "serviceca") pod "148f1f7a-b994-4984-a900-18e9d5868002" (UID: "148f1f7a-b994-4984-a900-18e9d5868002"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.561438 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148f1f7a-b994-4984-a900-18e9d5868002-kube-api-access-fcl2w" (OuterVolumeSpecName: "kube-api-access-fcl2w") pod "148f1f7a-b994-4984-a900-18e9d5868002" (UID: "148f1f7a-b994-4984-a900-18e9d5868002"). InnerVolumeSpecName "kube-api-access-fcl2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.644256 4606 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148f1f7a-b994-4984-a900-18e9d5868002-serviceca\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.644290 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcl2w\" (UniqueName: \"kubernetes.io/projected/148f1f7a-b994-4984-a900-18e9d5868002-kube-api-access-fcl2w\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.765829 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29424960-bh2l7" event={"ID":"148f1f7a-b994-4984-a900-18e9d5868002","Type":"ContainerDied","Data":"20c4027857969c7e2a2673cdc195dec9ed8c96c9024aa23f76e3cba36775fcbd"} Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.765873 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c4027857969c7e2a2673cdc195dec9ed8c96c9024aa23f76e3cba36775fcbd" Dec 12 00:26:29 crc kubenswrapper[4606]: I1212 00:26:29.765899 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29424960-bh2l7" Dec 12 00:26:32 crc kubenswrapper[4606]: I1212 00:26:32.009987 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:26:32 crc kubenswrapper[4606]: I1212 00:26:32.010417 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:26:33 crc kubenswrapper[4606]: E1212 00:26:33.197218 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 12 00:26:33 crc kubenswrapper[4606]: E1212 00:26:33.197677 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6qd8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rkpr5_openshift-marketplace(18996b4c-ea24-4fa6-8420-c6ff4cd30473): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 12 00:26:33 crc kubenswrapper[4606]: E1212 00:26:33.198848 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rkpr5" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" Dec 12 00:26:33 crc 
kubenswrapper[4606]: I1212 00:26:33.716521 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 12 00:26:33 crc kubenswrapper[4606]: E1212 00:26:33.716753 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148f1f7a-b994-4984-a900-18e9d5868002" containerName="image-pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.716764 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="148f1f7a-b994-4984-a900-18e9d5868002" containerName="image-pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: E1212 00:26:33.716780 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2ff433-bebc-4600-9c55-32549e52559c" containerName="pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.716787 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2ff433-bebc-4600-9c55-32549e52559c" containerName="pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: E1212 00:26:33.716796 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22692cd1-90b3-467c-b11b-03f029f9060e" containerName="pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.716803 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="22692cd1-90b3-467c-b11b-03f029f9060e" containerName="pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.716910 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2ff433-bebc-4600-9c55-32549e52559c" containerName="pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.716920 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="22692cd1-90b3-467c-b11b-03f029f9060e" containerName="pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.716930 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="148f1f7a-b994-4984-a900-18e9d5868002" containerName="image-pruner" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.717311 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.720824 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.721191 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.727713 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.893261 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06836183-841c-41cf-b6aa-26ad3cc92e58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.893376 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06836183-841c-41cf-b6aa-26ad3cc92e58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.994319 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06836183-841c-41cf-b6aa-26ad3cc92e58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.994427 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/06836183-841c-41cf-b6aa-26ad3cc92e58-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:33 crc kubenswrapper[4606]: I1212 00:26:33.994441 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06836183-841c-41cf-b6aa-26ad3cc92e58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:34 crc kubenswrapper[4606]: I1212 00:26:34.016982 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06836183-841c-41cf-b6aa-26ad3cc92e58-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:34 crc kubenswrapper[4606]: I1212 00:26:34.048137 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:36 crc kubenswrapper[4606]: I1212 00:26:36.973043 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 12 00:26:39 crc kubenswrapper[4606]: E1212 00:26:39.295222 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rkpr5" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" Dec 12 00:26:39 crc kubenswrapper[4606]: E1212 00:26:39.373455 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 12 00:26:39 crc kubenswrapper[4606]: E1212 00:26:39.373700 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54mqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fqkw9_openshift-marketplace(303940e6-1922-4197-ad2a-6524c192b1b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 12 00:26:39 crc kubenswrapper[4606]: E1212 00:26:39.374879 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fqkw9" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" Dec 12 00:26:39 crc 
kubenswrapper[4606]: I1212 00:26:39.713800 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.714743 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.735276 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.845786 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kube-api-access\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.845870 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-var-lock\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.845972 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.946646 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.946712 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kube-api-access\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.946718 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.946734 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-var-lock\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.946832 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-var-lock\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:39 crc kubenswrapper[4606]: I1212 00:26:39.969969 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kube-api-access\") pod \"installer-9-crc\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:40 crc kubenswrapper[4606]: I1212 00:26:40.045595 4606 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.320347 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fqkw9" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.422731 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.423196 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flcpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sfvgg_openshift-marketplace(77eaeae2-bc76-4cb3-9578-f24186325c2c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.427122 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sfvgg" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" Dec 12 00:26:42 crc 
kubenswrapper[4606]: E1212 00:26:42.437205 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.437367 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz9dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-kwr58_openshift-marketplace(828a9e1a-5485-4706-abd6-fb28b99d0f19): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.438531 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kwr58" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.518431 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.518562 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-drsn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xg5pj_openshift-marketplace(34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.520187 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xg5pj" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" Dec 12 00:26:42 crc 
kubenswrapper[4606]: I1212 00:26:42.823042 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 12 00:26:42 crc kubenswrapper[4606]: I1212 00:26:42.840090 4606 generic.go:334] "Generic (PLEG): container finished" podID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerID="ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d" exitCode=0 Dec 12 00:26:42 crc kubenswrapper[4606]: I1212 00:26:42.840226 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5998" event={"ID":"ee26d00d-079d-41c7-b641-5f2373eef2ee","Type":"ContainerDied","Data":"ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d"} Dec 12 00:26:42 crc kubenswrapper[4606]: W1212 00:26:42.841695 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod06836183_841c_41cf_b6aa_26ad3cc92e58.slice/crio-0337f2112c3772f4a81fffe9d5f3eb84e1cbf7397c8e53701cb78259bcc9ded2 WatchSource:0}: Error finding container 0337f2112c3772f4a81fffe9d5f3eb84e1cbf7397c8e53701cb78259bcc9ded2: Status 404 returned error can't find the container with id 0337f2112c3772f4a81fffe9d5f3eb84e1cbf7397c8e53701cb78259bcc9ded2 Dec 12 00:26:42 crc kubenswrapper[4606]: I1212 00:26:42.842573 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drkxp" event={"ID":"16aefbaf-f9b9-452e-84fa-a710b9284349","Type":"ContainerStarted","Data":"49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d"} Dec 12 00:26:42 crc kubenswrapper[4606]: I1212 00:26:42.852110 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krjvv" event={"ID":"4f43f314-8361-4374-87fc-00d8955b4ca4","Type":"ContainerStarted","Data":"f6873ca3bf900ca69933e5ea959181ad7b624a896ac52d9e8efd18ec0733c11f"} Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.855577 4606 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kwr58" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.855605 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sfvgg" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" Dec 12 00:26:42 crc kubenswrapper[4606]: E1212 00:26:42.855693 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xg5pj" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" Dec 12 00:26:42 crc kubenswrapper[4606]: I1212 00:26:42.899797 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mjjwd"] Dec 12 00:26:42 crc kubenswrapper[4606]: I1212 00:26:42.986053 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 12 00:26:43 crc kubenswrapper[4606]: W1212 00:26:43.016731 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podde9116f1_408c_4f51_9d29_0acfc5ed7f4f.slice/crio-133dde96b0f24570670147da20e9894c38b10a4fa7e2189a4eff1c7a7c98b356 WatchSource:0}: Error finding container 133dde96b0f24570670147da20e9894c38b10a4fa7e2189a4eff1c7a7c98b356: Status 404 returned error can't find the container with id 133dde96b0f24570670147da20e9894c38b10a4fa7e2189a4eff1c7a7c98b356 Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.858878 4606 generic.go:334] "Generic (PLEG): 
container finished" podID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerID="f6873ca3bf900ca69933e5ea959181ad7b624a896ac52d9e8efd18ec0733c11f" exitCode=0 Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.859092 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krjvv" event={"ID":"4f43f314-8361-4374-87fc-00d8955b4ca4","Type":"ContainerDied","Data":"f6873ca3bf900ca69933e5ea959181ad7b624a896ac52d9e8efd18ec0733c11f"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.864121 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" event={"ID":"0853dce1-c009-407e-960d-1113f85e503f","Type":"ContainerStarted","Data":"6d7c1c02f1a94fda2e52895664a04b0bfc900d2ee2302d1fb2afc9d191700d31"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.864149 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" event={"ID":"0853dce1-c009-407e-960d-1113f85e503f","Type":"ContainerStarted","Data":"576eb90963ac07cabe7fd9a28d800c0f92c8fde8846ba856afd17b6abf673700"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.867612 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06836183-841c-41cf-b6aa-26ad3cc92e58","Type":"ContainerStarted","Data":"fa6dc713c77107fd0abd1df3222e2b32155a0ab877b66ce2d52caa4670a6c5f6"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.867643 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06836183-841c-41cf-b6aa-26ad3cc92e58","Type":"ContainerStarted","Data":"0337f2112c3772f4a81fffe9d5f3eb84e1cbf7397c8e53701cb78259bcc9ded2"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.871715 4606 generic.go:334] "Generic (PLEG): container finished" podID="16aefbaf-f9b9-452e-84fa-a710b9284349" 
containerID="49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d" exitCode=0 Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.871794 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drkxp" event={"ID":"16aefbaf-f9b9-452e-84fa-a710b9284349","Type":"ContainerDied","Data":"49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.873467 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de9116f1-408c-4f51-9d29-0acfc5ed7f4f","Type":"ContainerStarted","Data":"86a12228627f6c7eeec8e1d5cd118632b072ec71b5e12826db85d066c59dab67"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.873494 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de9116f1-408c-4f51-9d29-0acfc5ed7f4f","Type":"ContainerStarted","Data":"133dde96b0f24570670147da20e9894c38b10a4fa7e2189a4eff1c7a7c98b356"} Dec 12 00:26:43 crc kubenswrapper[4606]: I1212 00:26:43.895635 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.895618886 podStartE2EDuration="10.895618886s" podCreationTimestamp="2025-12-12 00:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:26:43.887267524 +0000 UTC m=+194.432620390" watchObservedRunningTime="2025-12-12 00:26:43.895618886 +0000 UTC m=+194.440971752" Dec 12 00:26:44 crc kubenswrapper[4606]: I1212 00:26:44.878904 4606 generic.go:334] "Generic (PLEG): container finished" podID="06836183-841c-41cf-b6aa-26ad3cc92e58" containerID="fa6dc713c77107fd0abd1df3222e2b32155a0ab877b66ce2d52caa4670a6c5f6" exitCode=0 Dec 12 00:26:44 crc kubenswrapper[4606]: I1212 00:26:44.879123 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06836183-841c-41cf-b6aa-26ad3cc92e58","Type":"ContainerDied","Data":"fa6dc713c77107fd0abd1df3222e2b32155a0ab877b66ce2d52caa4670a6c5f6"} Dec 12 00:26:44 crc kubenswrapper[4606]: I1212 00:26:44.909298 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.909282491 podStartE2EDuration="5.909282491s" podCreationTimestamp="2025-12-12 00:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:26:44.906497773 +0000 UTC m=+195.451850639" watchObservedRunningTime="2025-12-12 00:26:44.909282491 +0000 UTC m=+195.454635357" Dec 12 00:26:45 crc kubenswrapper[4606]: I1212 00:26:45.031773 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dgfmw"] Dec 12 00:26:45 crc kubenswrapper[4606]: I1212 00:26:45.885611 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mjjwd" event={"ID":"0853dce1-c009-407e-960d-1113f85e503f","Type":"ContainerStarted","Data":"4eb8068eb486ba1fa9cdb5e12ad040e0b2e8109083b2fe8dc2ff219a752146e1"} Dec 12 00:26:45 crc kubenswrapper[4606]: I1212 00:26:45.903035 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mjjwd" podStartSLOduration=176.903014361 podStartE2EDuration="2m56.903014361s" podCreationTimestamp="2025-12-12 00:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:26:45.899118612 +0000 UTC m=+196.444471478" watchObservedRunningTime="2025-12-12 00:26:45.903014361 +0000 UTC m=+196.448367217" Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.114335 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.252758 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06836183-841c-41cf-b6aa-26ad3cc92e58-kubelet-dir\") pod \"06836183-841c-41cf-b6aa-26ad3cc92e58\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.252866 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06836183-841c-41cf-b6aa-26ad3cc92e58-kube-api-access\") pod \"06836183-841c-41cf-b6aa-26ad3cc92e58\" (UID: \"06836183-841c-41cf-b6aa-26ad3cc92e58\") " Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.252897 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06836183-841c-41cf-b6aa-26ad3cc92e58-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "06836183-841c-41cf-b6aa-26ad3cc92e58" (UID: "06836183-841c-41cf-b6aa-26ad3cc92e58"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.253107 4606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06836183-841c-41cf-b6aa-26ad3cc92e58-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.261555 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06836183-841c-41cf-b6aa-26ad3cc92e58-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "06836183-841c-41cf-b6aa-26ad3cc92e58" (UID: "06836183-841c-41cf-b6aa-26ad3cc92e58"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.354654 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06836183-841c-41cf-b6aa-26ad3cc92e58-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.892398 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.896003 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06836183-841c-41cf-b6aa-26ad3cc92e58","Type":"ContainerDied","Data":"0337f2112c3772f4a81fffe9d5f3eb84e1cbf7397c8e53701cb78259bcc9ded2"} Dec 12 00:26:46 crc kubenswrapper[4606]: I1212 00:26:46.896111 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0337f2112c3772f4a81fffe9d5f3eb84e1cbf7397c8e53701cb78259bcc9ded2" Dec 12 00:26:49 crc kubenswrapper[4606]: I1212 00:26:49.910645 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5998" event={"ID":"ee26d00d-079d-41c7-b641-5f2373eef2ee","Type":"ContainerStarted","Data":"921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3"} Dec 12 00:26:49 crc kubenswrapper[4606]: I1212 00:26:49.919449 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drkxp" event={"ID":"16aefbaf-f9b9-452e-84fa-a710b9284349","Type":"ContainerStarted","Data":"abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24"} Dec 12 00:26:49 crc kubenswrapper[4606]: I1212 00:26:49.924704 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krjvv" 
event={"ID":"4f43f314-8361-4374-87fc-00d8955b4ca4","Type":"ContainerStarted","Data":"9d85d8948a3a4a595ef3701786636e800b3bc6df78dc714317f579a2573885ba"} Dec 12 00:26:49 crc kubenswrapper[4606]: I1212 00:26:49.946782 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q5998" podStartSLOduration=4.929663554 podStartE2EDuration="55.946765578s" podCreationTimestamp="2025-12-12 00:25:54 +0000 UTC" firstStartedPulling="2025-12-12 00:25:56.934431734 +0000 UTC m=+147.479784600" lastFinishedPulling="2025-12-12 00:26:47.951533758 +0000 UTC m=+198.496886624" observedRunningTime="2025-12-12 00:26:49.942989434 +0000 UTC m=+200.488342300" watchObservedRunningTime="2025-12-12 00:26:49.946765578 +0000 UTC m=+200.492118444" Dec 12 00:26:49 crc kubenswrapper[4606]: I1212 00:26:49.975101 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-krjvv" podStartSLOduration=3.716571474 podStartE2EDuration="56.975082372s" podCreationTimestamp="2025-12-12 00:25:53 +0000 UTC" firstStartedPulling="2025-12-12 00:25:55.875741147 +0000 UTC m=+146.421094013" lastFinishedPulling="2025-12-12 00:26:49.134252045 +0000 UTC m=+199.679604911" observedRunningTime="2025-12-12 00:26:49.971040061 +0000 UTC m=+200.516392927" watchObservedRunningTime="2025-12-12 00:26:49.975082372 +0000 UTC m=+200.520435238" Dec 12 00:26:49 crc kubenswrapper[4606]: I1212 00:26:49.988688 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-drkxp" podStartSLOduration=3.64168237 podStartE2EDuration="56.988673499s" podCreationTimestamp="2025-12-12 00:25:53 +0000 UTC" firstStartedPulling="2025-12-12 00:25:55.872366633 +0000 UTC m=+146.417719509" lastFinishedPulling="2025-12-12 00:26:49.219357772 +0000 UTC m=+199.764710638" observedRunningTime="2025-12-12 00:26:49.98835327 +0000 UTC m=+200.533706136" watchObservedRunningTime="2025-12-12 
00:26:49.988673499 +0000 UTC m=+200.534026365" Dec 12 00:26:53 crc kubenswrapper[4606]: I1212 00:26:53.380381 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:26:53 crc kubenswrapper[4606]: I1212 00:26:53.380773 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:26:53 crc kubenswrapper[4606]: I1212 00:26:53.445052 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:26:53 crc kubenswrapper[4606]: I1212 00:26:53.950345 4606 generic.go:334] "Generic (PLEG): container finished" podID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerID="3a6a56d1565d6e8667e0a8510b3b6ae4f0b70ba769cbb222f2465309645eb084" exitCode=0 Dec 12 00:26:53 crc kubenswrapper[4606]: I1212 00:26:53.950457 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkpr5" event={"ID":"18996b4c-ea24-4fa6-8420-c6ff4cd30473","Type":"ContainerDied","Data":"3a6a56d1565d6e8667e0a8510b3b6ae4f0b70ba769cbb222f2465309645eb084"} Dec 12 00:26:54 crc kubenswrapper[4606]: I1212 00:26:54.039992 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:26:54 crc kubenswrapper[4606]: I1212 00:26:54.040071 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:26:54 crc kubenswrapper[4606]: I1212 00:26:54.278148 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:26:54 crc kubenswrapper[4606]: I1212 00:26:54.959655 4606 generic.go:334] "Generic (PLEG): container finished" podID="77eaeae2-bc76-4cb3-9578-f24186325c2c" 
containerID="72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae" exitCode=0 Dec 12 00:26:54 crc kubenswrapper[4606]: I1212 00:26:54.959732 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfvgg" event={"ID":"77eaeae2-bc76-4cb3-9578-f24186325c2c","Type":"ContainerDied","Data":"72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae"} Dec 12 00:26:54 crc kubenswrapper[4606]: I1212 00:26:54.963336 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkpr5" event={"ID":"18996b4c-ea24-4fa6-8420-c6ff4cd30473","Type":"ContainerStarted","Data":"ea5102ab4a6bfe746bf9579cee1801113bdabfe63e82d1d2b5686ffc6a4ca0dc"} Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.030997 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.044810 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rkpr5" podStartSLOduration=3.631449058 podStartE2EDuration="1m0.044789536s" podCreationTimestamp="2025-12-12 00:25:55 +0000 UTC" firstStartedPulling="2025-12-12 00:25:58.219518819 +0000 UTC m=+148.764871685" lastFinishedPulling="2025-12-12 00:26:54.632859297 +0000 UTC m=+205.178212163" observedRunningTime="2025-12-12 00:26:55.014910049 +0000 UTC m=+205.560262915" watchObservedRunningTime="2025-12-12 00:26:55.044789536 +0000 UTC m=+205.590142402" Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.405299 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.405513 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.462915 
4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.708681 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.708722 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.970640 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwr58" event={"ID":"828a9e1a-5485-4706-abd6-fb28b99d0f19","Type":"ContainerStarted","Data":"5f1904ad466704e614da49528fd03a316e55cd608fb52fd79b3a0607517eb2f2"} Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.974837 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfvgg" event={"ID":"77eaeae2-bc76-4cb3-9578-f24186325c2c","Type":"ContainerStarted","Data":"5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234"} Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.977343 4606 generic.go:334] "Generic (PLEG): container finished" podID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerID="bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7" exitCode=0 Dec 12 00:26:55 crc kubenswrapper[4606]: I1212 00:26:55.977957 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg5pj" event={"ID":"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4","Type":"ContainerDied","Data":"bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7"} Dec 12 00:26:56 crc kubenswrapper[4606]: I1212 00:26:56.017823 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sfvgg" podStartSLOduration=4.559642425 podStartE2EDuration="1m3.017805816s" 
podCreationTimestamp="2025-12-12 00:25:53 +0000 UTC" firstStartedPulling="2025-12-12 00:25:57.062681521 +0000 UTC m=+147.608034387" lastFinishedPulling="2025-12-12 00:26:55.520844912 +0000 UTC m=+206.066197778" observedRunningTime="2025-12-12 00:26:56.016732136 +0000 UTC m=+206.562085012" watchObservedRunningTime="2025-12-12 00:26:56.017805816 +0000 UTC m=+206.563158672" Dec 12 00:26:56 crc kubenswrapper[4606]: I1212 00:26:56.045967 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:26:56 crc kubenswrapper[4606]: I1212 00:26:56.765256 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rkpr5" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="registry-server" probeResult="failure" output=< Dec 12 00:26:56 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:26:56 crc kubenswrapper[4606]: > Dec 12 00:26:56 crc kubenswrapper[4606]: I1212 00:26:56.984747 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg5pj" event={"ID":"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4","Type":"ContainerStarted","Data":"ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60"} Dec 12 00:26:56 crc kubenswrapper[4606]: I1212 00:26:56.987689 4606 generic.go:334] "Generic (PLEG): container finished" podID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerID="5f1904ad466704e614da49528fd03a316e55cd608fb52fd79b3a0607517eb2f2" exitCode=0 Dec 12 00:26:56 crc kubenswrapper[4606]: I1212 00:26:56.987755 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwr58" event={"ID":"828a9e1a-5485-4706-abd6-fb28b99d0f19","Type":"ContainerDied","Data":"5f1904ad466704e614da49528fd03a316e55cd608fb52fd79b3a0607517eb2f2"} Dec 12 00:26:57 crc kubenswrapper[4606]: I1212 00:26:57.006918 4606 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-xg5pj" podStartSLOduration=4.406753732 podStartE2EDuration="1m4.00689782s" podCreationTimestamp="2025-12-12 00:25:53 +0000 UTC" firstStartedPulling="2025-12-12 00:25:56.902839336 +0000 UTC m=+147.448192202" lastFinishedPulling="2025-12-12 00:26:56.502983424 +0000 UTC m=+207.048336290" observedRunningTime="2025-12-12 00:26:57.005948673 +0000 UTC m=+207.551301559" watchObservedRunningTime="2025-12-12 00:26:57.00689782 +0000 UTC m=+207.552250686" Dec 12 00:26:57 crc kubenswrapper[4606]: I1212 00:26:57.800674 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krjvv"] Dec 12 00:26:57 crc kubenswrapper[4606]: I1212 00:26:57.801245 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-krjvv" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="registry-server" containerID="cri-o://9d85d8948a3a4a595ef3701786636e800b3bc6df78dc714317f579a2573885ba" gracePeriod=2 Dec 12 00:26:57 crc kubenswrapper[4606]: I1212 00:26:57.995862 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwr58" event={"ID":"828a9e1a-5485-4706-abd6-fb28b99d0f19","Type":"ContainerStarted","Data":"baa6a3e3b4200aa5e87c5419c158e6680be44b5094a5512a17df94b433730c6c"} Dec 12 00:26:57 crc kubenswrapper[4606]: I1212 00:26:57.998508 4606 generic.go:334] "Generic (PLEG): container finished" podID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerID="9d85d8948a3a4a595ef3701786636e800b3bc6df78dc714317f579a2573885ba" exitCode=0 Dec 12 00:26:57 crc kubenswrapper[4606]: I1212 00:26:57.998645 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krjvv" event={"ID":"4f43f314-8361-4374-87fc-00d8955b4ca4","Type":"ContainerDied","Data":"9d85d8948a3a4a595ef3701786636e800b3bc6df78dc714317f579a2573885ba"} Dec 12 00:26:58 crc kubenswrapper[4606]: 
I1212 00:26:58.000383 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkw9" event={"ID":"303940e6-1922-4197-ad2a-6524c192b1b5","Type":"ContainerStarted","Data":"5fa430cbc98eaa886b23f7cb54253b4f5b9e0765e97c2c3069e0d74f2de5ade5"} Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.052649 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwr58" podStartSLOduration=3.896487113 podStartE2EDuration="1m2.052629313s" podCreationTimestamp="2025-12-12 00:25:56 +0000 UTC" firstStartedPulling="2025-12-12 00:25:59.301492984 +0000 UTC m=+149.846845850" lastFinishedPulling="2025-12-12 00:26:57.457635184 +0000 UTC m=+208.002988050" observedRunningTime="2025-12-12 00:26:58.051330227 +0000 UTC m=+208.596683093" watchObservedRunningTime="2025-12-12 00:26:58.052629313 +0000 UTC m=+208.597982189" Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.592869 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.710424 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-catalog-content\") pod \"4f43f314-8361-4374-87fc-00d8955b4ca4\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.710518 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gbkx\" (UniqueName: \"kubernetes.io/projected/4f43f314-8361-4374-87fc-00d8955b4ca4-kube-api-access-9gbkx\") pod \"4f43f314-8361-4374-87fc-00d8955b4ca4\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.710562 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-utilities\") pod \"4f43f314-8361-4374-87fc-00d8955b4ca4\" (UID: \"4f43f314-8361-4374-87fc-00d8955b4ca4\") " Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.711436 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-utilities" (OuterVolumeSpecName: "utilities") pod "4f43f314-8361-4374-87fc-00d8955b4ca4" (UID: "4f43f314-8361-4374-87fc-00d8955b4ca4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.717748 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f43f314-8361-4374-87fc-00d8955b4ca4-kube-api-access-9gbkx" (OuterVolumeSpecName: "kube-api-access-9gbkx") pod "4f43f314-8361-4374-87fc-00d8955b4ca4" (UID: "4f43f314-8361-4374-87fc-00d8955b4ca4"). InnerVolumeSpecName "kube-api-access-9gbkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.782627 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f43f314-8361-4374-87fc-00d8955b4ca4" (UID: "4f43f314-8361-4374-87fc-00d8955b4ca4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.811906 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.811960 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f43f314-8361-4374-87fc-00d8955b4ca4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:58 crc kubenswrapper[4606]: I1212 00:26:58.811976 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gbkx\" (UniqueName: \"kubernetes.io/projected/4f43f314-8361-4374-87fc-00d8955b4ca4-kube-api-access-9gbkx\") on node \"crc\" DevicePath \"\"" Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.020613 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krjvv" event={"ID":"4f43f314-8361-4374-87fc-00d8955b4ca4","Type":"ContainerDied","Data":"5dd8bbcbb7a5c1404fd88b5e9fd7ea24dae04ec828987fbbb5ffd11ec665b057"} Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.020659 4606 scope.go:117] "RemoveContainer" containerID="9d85d8948a3a4a595ef3701786636e800b3bc6df78dc714317f579a2573885ba" Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.020783 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krjvv" Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.049489 4606 scope.go:117] "RemoveContainer" containerID="f6873ca3bf900ca69933e5ea959181ad7b624a896ac52d9e8efd18ec0733c11f" Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.049840 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krjvv"] Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.052557 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-krjvv"] Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.065639 4606 scope.go:117] "RemoveContainer" containerID="8ce8e512c3248ab076b72b38d89594667065c64b0d80813ee37cfe4278fa1bb6" Dec 12 00:26:59 crc kubenswrapper[4606]: I1212 00:26:59.706569 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" path="/var/lib/kubelet/pods/4f43f314-8361-4374-87fc-00d8955b4ca4/volumes" Dec 12 00:27:00 crc kubenswrapper[4606]: I1212 00:27:00.029394 4606 generic.go:334] "Generic (PLEG): container finished" podID="303940e6-1922-4197-ad2a-6524c192b1b5" containerID="5fa430cbc98eaa886b23f7cb54253b4f5b9e0765e97c2c3069e0d74f2de5ade5" exitCode=0 Dec 12 00:27:00 crc kubenswrapper[4606]: I1212 00:27:00.029494 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkw9" event={"ID":"303940e6-1922-4197-ad2a-6524c192b1b5","Type":"ContainerDied","Data":"5fa430cbc98eaa886b23f7cb54253b4f5b9e0765e97c2c3069e0d74f2de5ade5"} Dec 12 00:27:02 crc kubenswrapper[4606]: I1212 00:27:02.010877 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:27:02 crc 
kubenswrapper[4606]: I1212 00:27:02.012079 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:27:02 crc kubenswrapper[4606]: I1212 00:27:02.012228 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:27:02 crc kubenswrapper[4606]: I1212 00:27:02.012789 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:27:02 crc kubenswrapper[4606]: I1212 00:27:02.013007 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55" gracePeriod=600 Dec 12 00:27:03 crc kubenswrapper[4606]: I1212 00:27:03.047989 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55" exitCode=0 Dec 12 00:27:03 crc kubenswrapper[4606]: I1212 00:27:03.048073 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55"} 
Dec 12 00:27:03 crc kubenswrapper[4606]: I1212 00:27:03.048325 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"a395445cb02f1ca45c19ddbb727d5c765fe5792a9edbe62b6d34f236f52c7139"} Dec 12 00:27:03 crc kubenswrapper[4606]: I1212 00:27:03.050832 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkw9" event={"ID":"303940e6-1922-4197-ad2a-6524c192b1b5","Type":"ContainerStarted","Data":"78c3f9ad0cf1e9fa50796a473926316088ec8645f169bedad0718bfac093b0d5"} Dec 12 00:27:03 crc kubenswrapper[4606]: I1212 00:27:03.090270 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fqkw9" podStartSLOduration=2.9107176040000002 podStartE2EDuration="1m7.090233088s" podCreationTimestamp="2025-12-12 00:25:56 +0000 UTC" firstStartedPulling="2025-12-12 00:25:58.226601296 +0000 UTC m=+148.771954162" lastFinishedPulling="2025-12-12 00:27:02.40611678 +0000 UTC m=+212.951469646" observedRunningTime="2025-12-12 00:27:03.086304629 +0000 UTC m=+213.631657495" watchObservedRunningTime="2025-12-12 00:27:03.090233088 +0000 UTC m=+213.635585954" Dec 12 00:27:03 crc kubenswrapper[4606]: I1212 00:27:03.430537 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:27:04 crc kubenswrapper[4606]: I1212 00:27:04.290689 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:27:04 crc kubenswrapper[4606]: I1212 00:27:04.290745 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:27:04 crc kubenswrapper[4606]: I1212 00:27:04.290766 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:27:04 crc kubenswrapper[4606]: I1212 00:27:04.290777 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:27:04 crc kubenswrapper[4606]: I1212 00:27:04.338524 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:27:04 crc kubenswrapper[4606]: I1212 00:27:04.370400 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:27:05 crc kubenswrapper[4606]: I1212 00:27:05.104590 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:27:05 crc kubenswrapper[4606]: I1212 00:27:05.110626 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:27:05 crc kubenswrapper[4606]: I1212 00:27:05.740539 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:27:05 crc kubenswrapper[4606]: I1212 00:27:05.785330 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:27:06 crc kubenswrapper[4606]: I1212 00:27:06.657133 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfvgg"] Dec 12 00:27:06 crc kubenswrapper[4606]: I1212 00:27:06.758972 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:27:06 crc kubenswrapper[4606]: I1212 00:27:06.759374 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:27:07 crc kubenswrapper[4606]: I1212 
00:27:07.113632 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:27:07 crc kubenswrapper[4606]: I1212 00:27:07.113699 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:27:07 crc kubenswrapper[4606]: I1212 00:27:07.156095 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:27:07 crc kubenswrapper[4606]: I1212 00:27:07.806215 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqkw9" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="registry-server" probeResult="failure" output=< Dec 12 00:27:07 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:27:07 crc kubenswrapper[4606]: > Dec 12 00:27:08 crc kubenswrapper[4606]: I1212 00:27:08.079077 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sfvgg" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="registry-server" containerID="cri-o://5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234" gracePeriod=2 Dec 12 00:27:08 crc kubenswrapper[4606]: I1212 00:27:08.137236 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:27:08 crc kubenswrapper[4606]: I1212 00:27:08.402108 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkpr5"] Dec 12 00:27:08 crc kubenswrapper[4606]: I1212 00:27:08.402410 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rkpr5" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="registry-server" 
containerID="cri-o://ea5102ab4a6bfe746bf9579cee1801113bdabfe63e82d1d2b5686ffc6a4ca0dc" gracePeriod=2 Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.067400 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" podUID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" containerName="oauth-openshift" containerID="cri-o://61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce" gracePeriod=15 Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.407713 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.548212 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.592986 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-utilities\") pod \"77eaeae2-bc76-4cb3-9578-f24186325c2c\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.593143 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flcpx\" (UniqueName: \"kubernetes.io/projected/77eaeae2-bc76-4cb3-9578-f24186325c2c-kube-api-access-flcpx\") pod \"77eaeae2-bc76-4cb3-9578-f24186325c2c\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.593216 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-catalog-content\") pod \"77eaeae2-bc76-4cb3-9578-f24186325c2c\" (UID: \"77eaeae2-bc76-4cb3-9578-f24186325c2c\") " Dec 12 00:27:10 crc kubenswrapper[4606]: 
I1212 00:27:10.594435 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-utilities" (OuterVolumeSpecName: "utilities") pod "77eaeae2-bc76-4cb3-9578-f24186325c2c" (UID: "77eaeae2-bc76-4cb3-9578-f24186325c2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.599944 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77eaeae2-bc76-4cb3-9578-f24186325c2c-kube-api-access-flcpx" (OuterVolumeSpecName: "kube-api-access-flcpx") pod "77eaeae2-bc76-4cb3-9578-f24186325c2c" (UID: "77eaeae2-bc76-4cb3-9578-f24186325c2c"). InnerVolumeSpecName "kube-api-access-flcpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.646105 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77eaeae2-bc76-4cb3-9578-f24186325c2c" (UID: "77eaeae2-bc76-4cb3-9578-f24186325c2c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694465 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-trusted-ca-bundle\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694535 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-service-ca\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694569 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-login\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694595 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-ocp-branding-template\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694634 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-provider-selection\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: 
\"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694675 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-serving-cert\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694708 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-policies\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694744 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-session\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694772 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vksbn\" (UniqueName: \"kubernetes.io/projected/079b1c50-eaa5-4be5-a0d2-0015a67a1875-kube-api-access-vksbn\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694796 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-cliconfig\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694824 4606 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-router-certs\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694875 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-dir\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694902 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-error\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.694926 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-idp-0-file-data\") pod \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\" (UID: \"079b1c50-eaa5-4be5-a0d2-0015a67a1875\") " Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.695157 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.695202 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77eaeae2-bc76-4cb3-9578-f24186325c2c-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc 
kubenswrapper[4606]: I1212 00:27:10.695221 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flcpx\" (UniqueName: \"kubernetes.io/projected/77eaeae2-bc76-4cb3-9578-f24186325c2c-kube-api-access-flcpx\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.696146 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.696157 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.696232 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.697150 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.699743 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.700441 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.702463 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.703256 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.703452 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.703745 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.703814 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.704190 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.706501 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.715446 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079b1c50-eaa5-4be5-a0d2-0015a67a1875-kube-api-access-vksbn" (OuterVolumeSpecName: "kube-api-access-vksbn") pod "079b1c50-eaa5-4be5-a0d2-0015a67a1875" (UID: "079b1c50-eaa5-4be5-a0d2-0015a67a1875"). InnerVolumeSpecName "kube-api-access-vksbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.795936 4606 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.795969 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.795979 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.795990 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.795999 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796007 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796016 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796026 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796036 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796045 4606 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796054 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796063 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vksbn\" (UniqueName: \"kubernetes.io/projected/079b1c50-eaa5-4be5-a0d2-0015a67a1875-kube-api-access-vksbn\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796071 4606 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:10 crc kubenswrapper[4606]: I1212 00:27:10.796081 4606 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/079b1c50-eaa5-4be5-a0d2-0015a67a1875-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.006780 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwr58"] Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.007111 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwr58" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="registry-server" containerID="cri-o://baa6a3e3b4200aa5e87c5419c158e6680be44b5094a5512a17df94b433730c6c" gracePeriod=2 Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.100620 4606 generic.go:334] "Generic (PLEG): container finished" podID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerID="5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234" exitCode=0 Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.100689 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfvgg" event={"ID":"77eaeae2-bc76-4cb3-9578-f24186325c2c","Type":"ContainerDied","Data":"5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234"} Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.100739 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfvgg" event={"ID":"77eaeae2-bc76-4cb3-9578-f24186325c2c","Type":"ContainerDied","Data":"e1f7dc97b6cb6f849880022d071b0eb131b2ac93c97caa68eadee8da85b33e49"} Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.100751 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfvgg" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.100763 4606 scope.go:117] "RemoveContainer" containerID="5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.104077 4606 generic.go:334] "Generic (PLEG): container finished" podID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerID="ea5102ab4a6bfe746bf9579cee1801113bdabfe63e82d1d2b5686ffc6a4ca0dc" exitCode=0 Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.104309 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkpr5" event={"ID":"18996b4c-ea24-4fa6-8420-c6ff4cd30473","Type":"ContainerDied","Data":"ea5102ab4a6bfe746bf9579cee1801113bdabfe63e82d1d2b5686ffc6a4ca0dc"} Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.105892 4606 generic.go:334] "Generic (PLEG): container finished" podID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" containerID="61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce" exitCode=0 Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.105915 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" event={"ID":"079b1c50-eaa5-4be5-a0d2-0015a67a1875","Type":"ContainerDied","Data":"61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce"} Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.105931 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" event={"ID":"079b1c50-eaa5-4be5-a0d2-0015a67a1875","Type":"ContainerDied","Data":"8fea28cea6c35e7ac4af8b0e8dd81ced09cae2d35eb72f8ba1c189771a683ded"} Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.106002 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dgfmw" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.145532 4606 scope.go:117] "RemoveContainer" containerID="72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.163758 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfvgg"] Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.168664 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sfvgg"] Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.188040 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dgfmw"] Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.189503 4606 scope.go:117] "RemoveContainer" containerID="dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.192572 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dgfmw"] Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.203522 4606 scope.go:117] "RemoveContainer" containerID="5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234" Dec 12 00:27:11 crc kubenswrapper[4606]: E1212 00:27:11.204066 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234\": container with ID starting with 5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234 not found: ID does not exist" containerID="5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.204139 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234"} err="failed to get container status \"5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234\": rpc error: code = NotFound desc = could not find container \"5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234\": container with ID starting with 5e075c0689efbd2557c3359cb107e85aaf92cf7adc21dbe78b6c7a549ec9b234 not found: ID does not exist" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.204232 4606 scope.go:117] "RemoveContainer" containerID="72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae" Dec 12 00:27:11 crc kubenswrapper[4606]: E1212 00:27:11.204747 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae\": container with ID starting with 72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae not found: ID does not exist" containerID="72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.204811 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae"} err="failed to get container status \"72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae\": rpc error: code = NotFound desc = could not find container \"72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae\": container with ID starting with 72c3fde291e87710f12fb1bfc2bf94bd7b62bc175f57e5494fe08109e05be9ae not found: ID does not exist" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.204851 4606 scope.go:117] "RemoveContainer" containerID="dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6" Dec 12 00:27:11 crc kubenswrapper[4606]: E1212 00:27:11.205231 4606 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6\": container with ID starting with dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6 not found: ID does not exist" containerID="dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.205289 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6"} err="failed to get container status \"dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6\": rpc error: code = NotFound desc = could not find container \"dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6\": container with ID starting with dabffad379ec82879b7b60ca950b50b1712b91ba194e9d37458ef9caa4cee9d6 not found: ID does not exist" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.205331 4606 scope.go:117] "RemoveContainer" containerID="61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.224564 4606 scope.go:117] "RemoveContainer" containerID="61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce" Dec 12 00:27:11 crc kubenswrapper[4606]: E1212 00:27:11.225213 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce\": container with ID starting with 61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce not found: ID does not exist" containerID="61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.225274 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce"} 
err="failed to get container status \"61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce\": rpc error: code = NotFound desc = could not find container \"61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce\": container with ID starting with 61a62b7d118fee6eb82dc1f3fb5eafc71f04f337afad6c27d4c7647b0068dfce not found: ID does not exist" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.713974 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" path="/var/lib/kubelet/pods/079b1c50-eaa5-4be5-a0d2-0015a67a1875/volumes" Dec 12 00:27:11 crc kubenswrapper[4606]: I1212 00:27:11.715303 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" path="/var/lib/kubelet/pods/77eaeae2-bc76-4cb3-9578-f24186325c2c/volumes" Dec 12 00:27:11 crc kubenswrapper[4606]: E1212 00:27:11.855257 4606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828a9e1a_5485_4706_abd6_fb28b99d0f19.slice/crio-conmon-baa6a3e3b4200aa5e87c5419c158e6680be44b5094a5512a17df94b433730c6c.scope\": RecentStats: unable to find data in memory cache]" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.026279 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.114586 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qd8z\" (UniqueName: \"kubernetes.io/projected/18996b4c-ea24-4fa6-8420-c6ff4cd30473-kube-api-access-6qd8z\") pod \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.114715 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-catalog-content\") pod \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.114774 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-utilities\") pod \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\" (UID: \"18996b4c-ea24-4fa6-8420-c6ff4cd30473\") " Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.115990 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-utilities" (OuterVolumeSpecName: "utilities") pod "18996b4c-ea24-4fa6-8420-c6ff4cd30473" (UID: "18996b4c-ea24-4fa6-8420-c6ff4cd30473"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.124935 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18996b4c-ea24-4fa6-8420-c6ff4cd30473-kube-api-access-6qd8z" (OuterVolumeSpecName: "kube-api-access-6qd8z") pod "18996b4c-ea24-4fa6-8420-c6ff4cd30473" (UID: "18996b4c-ea24-4fa6-8420-c6ff4cd30473"). InnerVolumeSpecName "kube-api-access-6qd8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.129782 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkpr5" event={"ID":"18996b4c-ea24-4fa6-8420-c6ff4cd30473","Type":"ContainerDied","Data":"9f205c4a3d93d161523dc2cfd7161c1bdc8e7624d44a7da82fbfc1c2b07e4114"} Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.129825 4606 scope.go:117] "RemoveContainer" containerID="ea5102ab4a6bfe746bf9579cee1801113bdabfe63e82d1d2b5686ffc6a4ca0dc" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.129915 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkpr5" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.145216 4606 generic.go:334] "Generic (PLEG): container finished" podID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerID="baa6a3e3b4200aa5e87c5419c158e6680be44b5094a5512a17df94b433730c6c" exitCode=0 Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.145274 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwr58" event={"ID":"828a9e1a-5485-4706-abd6-fb28b99d0f19","Type":"ContainerDied","Data":"baa6a3e3b4200aa5e87c5419c158e6680be44b5094a5512a17df94b433730c6c"} Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.157780 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18996b4c-ea24-4fa6-8420-c6ff4cd30473" (UID: "18996b4c-ea24-4fa6-8420-c6ff4cd30473"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.162213 4606 scope.go:117] "RemoveContainer" containerID="3a6a56d1565d6e8667e0a8510b3b6ae4f0b70ba769cbb222f2465309645eb084" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.205875 4606 scope.go:117] "RemoveContainer" containerID="f91a1ee60687b4a608b25ae1ef9c058539f792ebe7c9d9ddf3954bb2ad9e6011" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.215859 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.216047 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18996b4c-ea24-4fa6-8420-c6ff4cd30473-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.216114 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qd8z\" (UniqueName: \"kubernetes.io/projected/18996b4c-ea24-4fa6-8420-c6ff4cd30473-kube-api-access-6qd8z\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.219275 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.317585 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9dp\" (UniqueName: \"kubernetes.io/projected/828a9e1a-5485-4706-abd6-fb28b99d0f19-kube-api-access-fz9dp\") pod \"828a9e1a-5485-4706-abd6-fb28b99d0f19\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.317929 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-catalog-content\") pod \"828a9e1a-5485-4706-abd6-fb28b99d0f19\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.317978 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-utilities\") pod \"828a9e1a-5485-4706-abd6-fb28b99d0f19\" (UID: \"828a9e1a-5485-4706-abd6-fb28b99d0f19\") " Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.319569 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-utilities" (OuterVolumeSpecName: "utilities") pod "828a9e1a-5485-4706-abd6-fb28b99d0f19" (UID: "828a9e1a-5485-4706-abd6-fb28b99d0f19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.322352 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828a9e1a-5485-4706-abd6-fb28b99d0f19-kube-api-access-fz9dp" (OuterVolumeSpecName: "kube-api-access-fz9dp") pod "828a9e1a-5485-4706-abd6-fb28b99d0f19" (UID: "828a9e1a-5485-4706-abd6-fb28b99d0f19"). InnerVolumeSpecName "kube-api-access-fz9dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.419735 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9dp\" (UniqueName: \"kubernetes.io/projected/828a9e1a-5485-4706-abd6-fb28b99d0f19-kube-api-access-fz9dp\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.419784 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.434870 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "828a9e1a-5485-4706-abd6-fb28b99d0f19" (UID: "828a9e1a-5485-4706-abd6-fb28b99d0f19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.457813 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkpr5"] Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.460416 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkpr5"] Dec 12 00:27:12 crc kubenswrapper[4606]: I1212 00:27:12.520764 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/828a9e1a-5485-4706-abd6-fb28b99d0f19-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.168005 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwr58" event={"ID":"828a9e1a-5485-4706-abd6-fb28b99d0f19","Type":"ContainerDied","Data":"9ba02e65463f1dcb01d77353b8da73cbaf6d01aff7ddecd1338ce38a5ada5e54"} Dec 12 00:27:13 crc 
kubenswrapper[4606]: I1212 00:27:13.168085 4606 scope.go:117] "RemoveContainer" containerID="baa6a3e3b4200aa5e87c5419c158e6680be44b5094a5512a17df94b433730c6c" Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.168270 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwr58" Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.196393 4606 scope.go:117] "RemoveContainer" containerID="5f1904ad466704e614da49528fd03a316e55cd608fb52fd79b3a0607517eb2f2" Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.200014 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwr58"] Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.204164 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwr58"] Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.231505 4606 scope.go:117] "RemoveContainer" containerID="3a7c46ffe8ff62eb781eb16a95ab202bf645b84e0c45dfd47f389581830aa219" Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.707573 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" path="/var/lib/kubelet/pods/18996b4c-ea24-4fa6-8420-c6ff4cd30473/volumes" Dec 12 00:27:13 crc kubenswrapper[4606]: I1212 00:27:13.708449 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" path="/var/lib/kubelet/pods/828a9e1a-5485-4706-abd6-fb28b99d0f19/volumes" Dec 12 00:27:16 crc kubenswrapper[4606]: I1212 00:27:16.808131 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:27:16 crc kubenswrapper[4606]: I1212 00:27:16.861401 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.420736 4606 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-578876cb6-x9cmg"] Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421220 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421234 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421248 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421256 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421267 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06836183-841c-41cf-b6aa-26ad3cc92e58" containerName="pruner" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421274 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="06836183-841c-41cf-b6aa-26ad3cc92e58" containerName="pruner" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421282 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421304 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421315 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421322 4606 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421334 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421341 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421352 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421359 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421373 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421381 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421389 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" containerName="oauth-openshift" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421396 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" containerName="oauth-openshift" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421406 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421413 4606 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="extract-content" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421423 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421430 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421442 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421448 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421458 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421465 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="extract-utilities" Dec 12 00:27:19 crc kubenswrapper[4606]: E1212 00:27:19.421476 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421483 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421584 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="77eaeae2-bc76-4cb3-9578-f24186325c2c" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421600 4606 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="18996b4c-ea24-4fa6-8420-c6ff4cd30473" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421612 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="06836183-841c-41cf-b6aa-26ad3cc92e58" containerName="pruner" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421621 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f43f314-8361-4374-87fc-00d8955b4ca4" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421632 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="079b1c50-eaa5-4be5-a0d2-0015a67a1875" containerName="oauth-openshift" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.421642 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="828a9e1a-5485-4706-abd6-fb28b99d0f19" containerName="registry-server" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.422061 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.425004 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.425535 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.425795 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.427279 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.429815 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.437324 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.437806 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.437882 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.437965 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.438010 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 12 
00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.438485 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.438712 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.444506 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-578876cb6-x9cmg"] Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.447540 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.448471 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.460207 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505452 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505544 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-router-certs\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: 
\"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505581 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505655 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505716 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/812a735b-7f3f-4443-9420-286f305e4125-audit-dir\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505766 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-session\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505835 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505888 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-service-ca\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505951 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h49k\" (UniqueName: \"kubernetes.io/projected/812a735b-7f3f-4443-9420-286f305e4125-kube-api-access-2h49k\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.505978 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-cliconfig\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.506051 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-error\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.506085 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-audit-policies\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.506124 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-login\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.506260 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-serving-cert\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.607770 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-session\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " 
pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.607841 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.607879 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-service-ca\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.607909 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h49k\" (UniqueName: \"kubernetes.io/projected/812a735b-7f3f-4443-9420-286f305e4125-kube-api-access-2h49k\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.607938 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-cliconfig\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.607969 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-error\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608005 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-audit-policies\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608048 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-login\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608091 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-serving-cert\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608127 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " 
pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608190 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-router-certs\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608224 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608277 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608322 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/812a735b-7f3f-4443-9420-286f305e4125-audit-dir\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.608399 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/812a735b-7f3f-4443-9420-286f305e4125-audit-dir\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.609147 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-service-ca\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.609937 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-cliconfig\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.610708 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-audit-policies\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.611691 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 
00:27:19.616218 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.616284 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.616714 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-login\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.616935 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-user-template-error\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.619396 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.619574 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-router-certs\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.619733 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.619879 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/812a735b-7f3f-4443-9420-286f305e4125-v4-0-config-system-session\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.643641 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h49k\" (UniqueName: \"kubernetes.io/projected/812a735b-7f3f-4443-9420-286f305e4125-kube-api-access-2h49k\") pod \"oauth-openshift-578876cb6-x9cmg\" (UID: \"812a735b-7f3f-4443-9420-286f305e4125\") " pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:19 crc kubenswrapper[4606]: I1212 00:27:19.741905 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.172631 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-578876cb6-x9cmg"] Dec 12 00:27:20 crc kubenswrapper[4606]: W1212 00:27:20.183505 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod812a735b_7f3f_4443_9420_286f305e4125.slice/crio-555400d1db4efa8652e6f8e9113aa48fffdea0640e449dadb540c9e3c776a3a7 WatchSource:0}: Error finding container 555400d1db4efa8652e6f8e9113aa48fffdea0640e449dadb540c9e3c776a3a7: Status 404 returned error can't find the container with id 555400d1db4efa8652e6f8e9113aa48fffdea0640e449dadb540c9e3c776a3a7 Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.206507 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" event={"ID":"812a735b-7f3f-4443-9420-286f305e4125","Type":"ContainerStarted","Data":"555400d1db4efa8652e6f8e9113aa48fffdea0640e449dadb540c9e3c776a3a7"} Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.876657 4606 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.878118 4606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.878741 4606 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.878795 4606 file.go:109] "Unable to process watch event" err="can't process 
config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.878895 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.879032 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6" gracePeriod=15 Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.879069 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065" gracePeriod=15 Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.879117 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66" gracePeriod=15 Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.879062 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4" gracePeriod=15 Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.879011 
4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9" gracePeriod=15 Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880360 4606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.880495 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880510 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.880520 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880527 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.880534 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880541 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.880549 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880554 4606 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.880563 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880568 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.880577 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880583 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880671 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880682 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880698 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880709 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880720 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 12 
00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880733 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 12 00:27:20 crc kubenswrapper[4606]: E1212 00:27:20.880817 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 12 00:27:20 crc kubenswrapper[4606]: I1212 00:27:20.880824 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027257 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027330 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027354 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027374 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027403 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027441 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027493 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.027516 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129034 
4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129080 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129099 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129140 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129194 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129239 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129257 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129280 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129321 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129351 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129333 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129376 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129398 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129396 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129419 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.129395 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.212789 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" event={"ID":"812a735b-7f3f-4443-9420-286f305e4125","Type":"ContainerStarted","Data":"056cb3fe47d5837feb9c4441af07ddbb1d12dacb16b2d83ed8882e8f400f393d"} Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.213514 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.214948 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.215398 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.215627 4606 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.216104 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 12 00:27:21 crc 
kubenswrapper[4606]: I1212 00:27:21.217267 4606 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6" exitCode=0 Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.217300 4606 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66" exitCode=0 Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.217312 4606 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4" exitCode=0 Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.217319 4606 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065" exitCode=2 Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.217316 4606 scope.go:117] "RemoveContainer" containerID="c75350d79a93c04db2bd6d71995920cde9b897776249b8696f3ead3939f7d5d4" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.222160 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.222596 4606 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.223054 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.229901 4606 generic.go:334] "Generic (PLEG): container finished" podID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" containerID="86a12228627f6c7eeec8e1d5cd118632b072ec71b5e12826db85d066c59dab67" exitCode=0 Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.229955 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de9116f1-408c-4f51-9d29-0acfc5ed7f4f","Type":"ContainerDied","Data":"86a12228627f6c7eeec8e1d5cd118632b072ec71b5e12826db85d066c59dab67"} Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.231139 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.231469 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:21 crc kubenswrapper[4606]: I1212 00:27:21.231798 4606 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.237217 4606 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.510224 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.510921 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.511470 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.647922 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kube-api-access\") pod \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.648011 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-var-lock\") pod \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.648076 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-var-lock" (OuterVolumeSpecName: "var-lock") pod "de9116f1-408c-4f51-9d29-0acfc5ed7f4f" (UID: "de9116f1-408c-4f51-9d29-0acfc5ed7f4f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.648121 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kubelet-dir\") pod \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\" (UID: \"de9116f1-408c-4f51-9d29-0acfc5ed7f4f\") " Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.648227 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de9116f1-408c-4f51-9d29-0acfc5ed7f4f" (UID: "de9116f1-408c-4f51-9d29-0acfc5ed7f4f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.648522 4606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-var-lock\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.648558 4606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.652646 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de9116f1-408c-4f51-9d29-0acfc5ed7f4f" (UID: "de9116f1-408c-4f51-9d29-0acfc5ed7f4f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:27:22 crc kubenswrapper[4606]: I1212 00:27:22.750164 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9116f1-408c-4f51-9d29-0acfc5ed7f4f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.245840 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"de9116f1-408c-4f51-9d29-0acfc5ed7f4f","Type":"ContainerDied","Data":"133dde96b0f24570670147da20e9894c38b10a4fa7e2189a4eff1c7a7c98b356"} Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.246200 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133dde96b0f24570670147da20e9894c38b10a4fa7e2189a4eff1c7a7c98b356" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.245859 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.342993 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.343408 4606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.343898 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.343905 4606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.345573 4606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.347146 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.348013 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.348147 4606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.348393 4606 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.348583 4606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.348624 4606 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.348595 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.348885 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms" Dec 
12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.349111 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.477674 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.477775 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.477806 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.477865 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.477916 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.477984 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.478308 4606 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.478331 4606 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:23 crc kubenswrapper[4606]: I1212 00:27:23.478341 4606 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.549643 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Dec 12 00:27:23 crc 
kubenswrapper[4606]: I1212 00:27:23.707906 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 12 00:27:23 crc kubenswrapper[4606]: E1212 00:27:23.951091 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.255636 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.256813 4606 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9" exitCode=0 Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.256908 4606 scope.go:117] "RemoveContainer" containerID="4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.256958 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.257880 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.258823 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.259112 4606 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.260347 4606 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.260711 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.261057 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.275476 4606 scope.go:117] "RemoveContainer" containerID="b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.293774 4606 scope.go:117] "RemoveContainer" containerID="f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.324907 4606 scope.go:117] "RemoveContainer" containerID="e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.343907 4606 scope.go:117] "RemoveContainer" containerID="2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.360202 4606 scope.go:117] "RemoveContainer" containerID="0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.390030 4606 scope.go:117] "RemoveContainer" containerID="4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6" Dec 12 00:27:24 crc kubenswrapper[4606]: E1212 00:27:24.390864 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\": container with ID starting with 4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6 not 
found: ID does not exist" containerID="4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.390918 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6"} err="failed to get container status \"4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\": rpc error: code = NotFound desc = could not find container \"4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6\": container with ID starting with 4b764ca1153735e365d95ed2855c7d41cd7457614e1ddce0f008d62720d607e6 not found: ID does not exist" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.390949 4606 scope.go:117] "RemoveContainer" containerID="b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66" Dec 12 00:27:24 crc kubenswrapper[4606]: E1212 00:27:24.391380 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\": container with ID starting with b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66 not found: ID does not exist" containerID="b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.391406 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66"} err="failed to get container status \"b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\": rpc error: code = NotFound desc = could not find container \"b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66\": container with ID starting with b3baf1bf5fbaa04f9f165f43b6a4fba496be0d453641fcd8bbc811687891dd66 not found: ID does not exist" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.391429 
4606 scope.go:117] "RemoveContainer" containerID="f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4" Dec 12 00:27:24 crc kubenswrapper[4606]: E1212 00:27:24.391765 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\": container with ID starting with f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4 not found: ID does not exist" containerID="f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.391788 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4"} err="failed to get container status \"f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\": rpc error: code = NotFound desc = could not find container \"f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4\": container with ID starting with f9531d10c07644d54aeff48edf4ac068968acbd1b6597baabcb0660e7bcd54c4 not found: ID does not exist" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.391803 4606 scope.go:117] "RemoveContainer" containerID="e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065" Dec 12 00:27:24 crc kubenswrapper[4606]: E1212 00:27:24.392134 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\": container with ID starting with e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065 not found: ID does not exist" containerID="e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.392321 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065"} err="failed to get container status \"e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\": rpc error: code = NotFound desc = could not find container \"e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065\": container with ID starting with e76f2e4c2d97f36aebba4bcbabe12214a1df27637376d1440ee54d43fca39065 not found: ID does not exist" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.392348 4606 scope.go:117] "RemoveContainer" containerID="2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9" Dec 12 00:27:24 crc kubenswrapper[4606]: E1212 00:27:24.392847 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\": container with ID starting with 2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9 not found: ID does not exist" containerID="2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.392877 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9"} err="failed to get container status \"2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\": rpc error: code = NotFound desc = could not find container \"2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9\": container with ID starting with 2d18b3a560566347e9cfcd748453a89d1588de77f02a5e961a71916c6182eff9 not found: ID does not exist" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.392893 4606 scope.go:117] "RemoveContainer" containerID="0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13" Dec 12 00:27:24 crc kubenswrapper[4606]: E1212 00:27:24.393136 4606 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\": container with ID starting with 0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13 not found: ID does not exist" containerID="0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13" Dec 12 00:27:24 crc kubenswrapper[4606]: I1212 00:27:24.393158 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13"} err="failed to get container status \"0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\": rpc error: code = NotFound desc = could not find container \"0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13\": container with ID starting with 0bbabfdcef0c9f4dcaada0bf5c2e98c083fd0f809462f017f8a767ef800e8e13 not found: ID does not exist" Dec 12 00:27:24 crc kubenswrapper[4606]: E1212 00:27:24.752776 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Dec 12 00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.748131 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:27:25Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:27:25Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:27:25Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-12T00:27:25Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.748391 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.748556 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.748817 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 
00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.749331 4606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.749357 4606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 12 00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.921676 4606 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:25 crc kubenswrapper[4606]: I1212 00:27:25.922272 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:25 crc kubenswrapper[4606]: E1212 00:27:25.956057 4606 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18805035ce66b87f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-12 00:27:25.955561599 +0000 UTC 
m=+236.500914505,LastTimestamp:2025-12-12 00:27:25.955561599 +0000 UTC m=+236.500914505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 12 00:27:26 crc kubenswrapper[4606]: I1212 00:27:26.270323 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe"} Dec 12 00:27:26 crc kubenswrapper[4606]: I1212 00:27:26.270661 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"74459a891ffaf0b721e830d5e32908b9dc198c511dba6faa6e68cbf43ab76c01"} Dec 12 00:27:26 crc kubenswrapper[4606]: E1212 00:27:26.271257 4606 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:27:26 crc kubenswrapper[4606]: I1212 00:27:26.271384 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:26 crc kubenswrapper[4606]: I1212 00:27:26.271650 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 
38.102.83.204:6443: connect: connection refused" Dec 12 00:27:26 crc kubenswrapper[4606]: E1212 00:27:26.353264 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="3.2s" Dec 12 00:27:29 crc kubenswrapper[4606]: E1212 00:27:29.554942 4606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="6.4s" Dec 12 00:27:29 crc kubenswrapper[4606]: I1212 00:27:29.703358 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:29 crc kubenswrapper[4606]: I1212 00:27:29.703660 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:30 crc kubenswrapper[4606]: E1212 00:27:30.025153 4606 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18805035ce66b87f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-12 00:27:25.955561599 +0000 UTC m=+236.500914505,LastTimestamp:2025-12-12 00:27:25.955561599 +0000 UTC m=+236.500914505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 12 00:27:30 crc kubenswrapper[4606]: E1212 00:27:30.728076 4606 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" volumeName="registry-storage" Dec 12 00:27:34 crc kubenswrapper[4606]: I1212 00:27:34.699082 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:34 crc kubenswrapper[4606]: I1212 00:27:34.701035 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:34 crc kubenswrapper[4606]: I1212 00:27:34.701747 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:34 crc kubenswrapper[4606]: I1212 00:27:34.724133 4606 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:34 crc kubenswrapper[4606]: I1212 00:27:34.724188 4606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:34 crc kubenswrapper[4606]: E1212 00:27:34.724671 4606 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:34 crc kubenswrapper[4606]: I1212 00:27:34.725188 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.320998 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.321278 4606 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f" exitCode=1 Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.321338 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f"} Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.321792 4606 scope.go:117] "RemoveContainer" containerID="a380a96c542428bf9cf66391f4d843144052269b294636e863319b2ba954047f" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.322269 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.322927 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.323427 4606 status_manager.go:851] "Failed to get 
status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.324382 4606 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1e83e5eadd27af46d90825e014688c8dc385680760d5008f9c525e81f78abe18" exitCode=0 Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.324440 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1e83e5eadd27af46d90825e014688c8dc385680760d5008f9c525e81f78abe18"} Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.324493 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8de3f104dc31bbaff3e9b6237f7c401b00f32e72e945430f61ba5651caf89112"} Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.324968 4606 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.325009 4606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.325449 4606 status_manager.go:851] "Failed to get status for pod" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 
12 00:27:35 crc kubenswrapper[4606]: E1212 00:27:35.325589 4606 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.325927 4606 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.326426 4606 status_manager.go:851] "Failed to get status for pod" podUID="812a735b-7f3f-4443-9420-286f305e4125" pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-578876cb6-x9cmg\": dial tcp 38.102.83.204:6443: connect: connection refused" Dec 12 00:27:35 crc kubenswrapper[4606]: I1212 00:27:35.521421 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:27:36 crc kubenswrapper[4606]: I1212 00:27:36.334678 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8ac52153964d43292674801f8330c41eb20c02a22d5195f025caf0cbe49b0e2f"} Dec 12 00:27:36 crc kubenswrapper[4606]: I1212 00:27:36.334985 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bfeeedc15a65caa33eef041e4b3b6abde88131d76d409b5fd544d248e346a732"} Dec 12 00:27:36 crc kubenswrapper[4606]: I1212 00:27:36.334996 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f499039f8a99e4959845f0f5241ee89786a22e63fd0a3c345d6a04897d641d73"} Dec 12 00:27:36 crc kubenswrapper[4606]: I1212 00:27:36.335004 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"29e65998c596a46b33a4bfb77065a2e802af794a12476128e6bf1e6e78c3d8b1"} Dec 12 00:27:36 crc kubenswrapper[4606]: I1212 00:27:36.338127 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 12 00:27:36 crc kubenswrapper[4606]: I1212 00:27:36.338180 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"503a4f0df4d8ba35985534733f8c9f1de54c5168747f27edc9510fa30aa28fbd"} Dec 12 00:27:37 crc kubenswrapper[4606]: I1212 00:27:37.345292 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c077acebec470a7a4c95a4c018efed2d3ea6cbc3e289882e47914200454a419a"} Dec 12 00:27:37 crc kubenswrapper[4606]: I1212 00:27:37.345661 4606 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:37 crc kubenswrapper[4606]: I1212 00:27:37.345690 4606 mirror_client.go:130] "Deleting a mirror 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:39 crc kubenswrapper[4606]: I1212 00:27:39.725974 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:39 crc kubenswrapper[4606]: I1212 00:27:39.726020 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:39 crc kubenswrapper[4606]: I1212 00:27:39.731394 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:41 crc kubenswrapper[4606]: I1212 00:27:41.690973 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:27:41 crc kubenswrapper[4606]: I1212 00:27:41.695137 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:27:42 crc kubenswrapper[4606]: I1212 00:27:42.354478 4606 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:42 crc kubenswrapper[4606]: I1212 00:27:42.372947 4606 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:42 crc kubenswrapper[4606]: I1212 00:27:42.372974 4606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:42 crc kubenswrapper[4606]: I1212 00:27:42.373052 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:27:42 crc kubenswrapper[4606]: I1212 00:27:42.373077 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:42 crc kubenswrapper[4606]: I1212 00:27:42.376786 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:27:42 crc kubenswrapper[4606]: I1212 00:27:42.382092 4606 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c91221ca-ffc2-40d3-b2d0-cb8aee38dc9c" Dec 12 00:27:43 crc kubenswrapper[4606]: I1212 00:27:43.377604 4606 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:43 crc kubenswrapper[4606]: I1212 00:27:43.377636 4606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="92839d27-9755-4aaa-a4c2-8aa83b6473de" Dec 12 00:27:49 crc kubenswrapper[4606]: I1212 00:27:49.723110 4606 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c91221ca-ffc2-40d3-b2d0-cb8aee38dc9c" Dec 12 00:27:51 crc kubenswrapper[4606]: I1212 00:27:51.393792 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 12 00:27:51 crc kubenswrapper[4606]: I1212 00:27:51.541909 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 12 00:27:51 crc kubenswrapper[4606]: I1212 00:27:51.672215 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 12 00:27:53 crc kubenswrapper[4606]: I1212 00:27:53.157480 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 12 00:27:53 crc kubenswrapper[4606]: I1212 
00:27:53.570099 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 12 00:27:53 crc kubenswrapper[4606]: I1212 00:27:53.894820 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 12 00:27:53 crc kubenswrapper[4606]: I1212 00:27:53.959481 4606 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.008857 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.145150 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.320262 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.502350 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.642002 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.661752 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.685899 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.855066 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 12 00:27:54 crc kubenswrapper[4606]: I1212 00:27:54.981075 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.253533 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.297666 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.372942 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.415774 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.449392 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.449653 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.470875 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.596370 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.613641 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.660798 4606 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.713632 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.730729 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.770731 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.820025 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.845951 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.940646 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 12 00:27:55 crc kubenswrapper[4606]: I1212 00:27:55.970141 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.021713 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.076040 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.121516 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 12 
00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.132893 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.141199 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.159256 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.159456 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.159630 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.233357 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.302945 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.308887 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.311258 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.314681 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.339817 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.446750 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.537107 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.540539 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.619465 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.675243 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.794573 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.812363 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.866818 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.879036 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.917651 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 12 00:27:56 crc 
kubenswrapper[4606]: I1212 00:27:56.996213 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 12 00:27:56 crc kubenswrapper[4606]: I1212 00:27:56.998563 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.009515 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.049193 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.099722 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.180224 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.182292 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.233330 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.254105 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.283602 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.317628 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.323364 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.460584 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.559820 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.616573 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.642108 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.760549 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.888004 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.911001 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 12 00:27:57 crc kubenswrapper[4606]: I1212 00:27:57.993260 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.054379 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.213791 4606 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.404375 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.405585 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.445646 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.610962 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.684600 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.692774 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.810518 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.849701 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.859806 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.883775 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.898764 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 12 00:27:58 crc kubenswrapper[4606]: I1212 00:27:58.955791 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.016065 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.076296 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.247736 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.371279 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.471661 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.482671 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.545589 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.586089 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.626717 4606 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.717919 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.758762 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.796755 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.797324 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.951443 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.987987 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 12 00:27:59 crc kubenswrapper[4606]: I1212 00:27:59.997783 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.137488 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.184590 4606 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.200375 4606 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.202647 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-578876cb6-x9cmg" podStartSLOduration=75.20262603 podStartE2EDuration="1m15.20262603s" podCreationTimestamp="2025-12-12 00:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:27:41.944418267 +0000 UTC m=+252.489771153" watchObservedRunningTime="2025-12-12 00:28:00.20262603 +0000 UTC m=+270.747978886" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.206950 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.207035 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.214512 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.236953 4606 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.251364 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.251331093 podStartE2EDuration="18.251331093s" podCreationTimestamp="2025-12-12 00:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:28:00.247617401 +0000 UTC m=+270.792970317" watchObservedRunningTime="2025-12-12 00:28:00.251331093 +0000 UTC m=+270.796684089" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.303662 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.317780 4606 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.323272 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.520726 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.557746 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.563037 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.583813 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.641978 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.658583 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.684370 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.744632 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.846730 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 12 00:28:00 
crc kubenswrapper[4606]: I1212 00:28:00.883065 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 12 00:28:00 crc kubenswrapper[4606]: I1212 00:28:00.997349 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.026125 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.122123 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.125795 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.352265 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.378809 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.395564 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.402859 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.453461 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.460256 4606 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.464768 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.536379 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.576259 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.614581 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.621232 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.661119 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.723393 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.723995 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.744360 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.902512 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.919232 4606 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.950090 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 12 00:28:01 crc kubenswrapper[4606]: I1212 00:28:01.962913 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.006519 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.068409 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.070083 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.102457 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.114751 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.134796 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.218862 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.377260 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.391215 4606 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.453072 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.483115 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.669703 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.713327 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.714637 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.902244 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.903164 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.935156 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 12 00:28:02 crc kubenswrapper[4606]: I1212 00:28:02.961612 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.028965 
4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.049051 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.289281 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.320260 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.321903 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.360354 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.375558 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.393747 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.609600 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.610768 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.685460 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" 
Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.703664 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.757604 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.759268 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.808586 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.812874 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.821396 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.862737 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 12 00:28:03 crc kubenswrapper[4606]: I1212 00:28:03.985331 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.037353 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.069943 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.138495 4606 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.174613 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.223593 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.228610 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.279967 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.294915 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.299805 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.346352 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.385874 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.464653 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.525872 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.542065 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.579214 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.580137 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.595765 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.608569 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.634051 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.635925 4606 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.636351 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe" gracePeriod=5 Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.638365 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 12 00:28:04 crc 
kubenswrapper[4606]: I1212 00:28:04.642455 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.656934 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.693314 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.792809 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.800417 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.878358 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.927944 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 12 00:28:04 crc kubenswrapper[4606]: I1212 00:28:04.950898 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.081360 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.090310 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.250445 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.252417 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.258624 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.298571 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.300211 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.387513 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.456237 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.572146 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.711623 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.794581 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.898614 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 12 00:28:05 crc kubenswrapper[4606]: I1212 00:28:05.969969 4606 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.034371 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.035788 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.093824 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.158850 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.203022 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.317412 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.341894 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.357391 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.898678 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.933281 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 
12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.935329 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 12 00:28:06 crc kubenswrapper[4606]: I1212 00:28:06.978031 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 12 00:28:07 crc kubenswrapper[4606]: I1212 00:28:07.289004 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 12 00:28:07 crc kubenswrapper[4606]: I1212 00:28:07.304757 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 12 00:28:07 crc kubenswrapper[4606]: I1212 00:28:07.408797 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 12 00:28:07 crc kubenswrapper[4606]: I1212 00:28:07.442683 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 12 00:28:07 crc kubenswrapper[4606]: I1212 00:28:07.734564 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 12 00:28:07 crc kubenswrapper[4606]: I1212 00:28:07.916777 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.184528 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.191892 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.193785 4606 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.385114 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.509034 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.552408 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.558471 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.642169 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 12 00:28:08 crc kubenswrapper[4606]: I1212 00:28:08.665045 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.210978 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.211347 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.289598 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.289689 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.289715 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.289729 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.289760 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.289961 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: 
"var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.290048 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.290090 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.290131 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.298364 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.391472 4606 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.391502 4606 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.391510 4606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.391519 4606 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.391526 4606 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.510976 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.511037 4606 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe" exitCode=137 Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.511084 4606 scope.go:117] "RemoveContainer" 
containerID="d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.511202 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.543045 4606 scope.go:117] "RemoveContainer" containerID="d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe" Dec 12 00:28:10 crc kubenswrapper[4606]: E1212 00:28:10.543982 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe\": container with ID starting with d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe not found: ID does not exist" containerID="d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe" Dec 12 00:28:10 crc kubenswrapper[4606]: I1212 00:28:10.544090 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe"} err="failed to get container status \"d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe\": rpc error: code = NotFound desc = could not find container \"d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe\": container with ID starting with d9f8005916acc534014ff5a639c95b69688e3bcc8bf58cfccaeeb25a14c0edbe not found: ID does not exist" Dec 12 00:28:11 crc kubenswrapper[4606]: I1212 00:28:11.708446 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 12 00:28:25 crc kubenswrapper[4606]: I1212 00:28:25.590784 4606 generic.go:334] "Generic (PLEG): container finished" podID="d5b97b3b-8994-4d7f-a165-c04d13546e89" 
containerID="edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a" exitCode=0 Dec 12 00:28:25 crc kubenswrapper[4606]: I1212 00:28:25.590885 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" event={"ID":"d5b97b3b-8994-4d7f-a165-c04d13546e89","Type":"ContainerDied","Data":"edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a"} Dec 12 00:28:25 crc kubenswrapper[4606]: I1212 00:28:25.592104 4606 scope.go:117] "RemoveContainer" containerID="edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a" Dec 12 00:28:26 crc kubenswrapper[4606]: I1212 00:28:26.601847 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" event={"ID":"d5b97b3b-8994-4d7f-a165-c04d13546e89","Type":"ContainerStarted","Data":"f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6"} Dec 12 00:28:26 crc kubenswrapper[4606]: I1212 00:28:26.602603 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:28:26 crc kubenswrapper[4606]: I1212 00:28:26.605609 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:28:29 crc kubenswrapper[4606]: I1212 00:28:29.598623 4606 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 12 00:28:38 crc kubenswrapper[4606]: I1212 00:28:38.623613 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dd5sp"] Dec 12 00:28:38 crc kubenswrapper[4606]: I1212 00:28:38.624399 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" podUID="16a5a061-f2aa-430e-9898-b7adff8ccb50" containerName="controller-manager" 
containerID="cri-o://b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9" gracePeriod=30 Dec 12 00:28:38 crc kubenswrapper[4606]: I1212 00:28:38.715645 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"] Dec 12 00:28:38 crc kubenswrapper[4606]: I1212 00:28:38.716092 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" podUID="43bb746f-62c0-45c5-b1db-490810a0ba0e" containerName="route-controller-manager" containerID="cri-o://478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35" gracePeriod=30 Dec 12 00:28:38 crc kubenswrapper[4606]: I1212 00:28:38.970353 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.039493 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.059299 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rw2l\" (UniqueName: \"kubernetes.io/projected/16a5a061-f2aa-430e-9898-b7adff8ccb50-kube-api-access-2rw2l\") pod \"16a5a061-f2aa-430e-9898-b7adff8ccb50\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.059391 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-client-ca\") pod \"16a5a061-f2aa-430e-9898-b7adff8ccb50\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.059423 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a5a061-f2aa-430e-9898-b7adff8ccb50-serving-cert\") pod \"16a5a061-f2aa-430e-9898-b7adff8ccb50\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.059497 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-config\") pod \"16a5a061-f2aa-430e-9898-b7adff8ccb50\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.059528 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-proxy-ca-bundles\") pod \"16a5a061-f2aa-430e-9898-b7adff8ccb50\" (UID: \"16a5a061-f2aa-430e-9898-b7adff8ccb50\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.060407 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16a5a061-f2aa-430e-9898-b7adff8ccb50" (UID: "16a5a061-f2aa-430e-9898-b7adff8ccb50"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.060722 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-client-ca" (OuterVolumeSpecName: "client-ca") pod "16a5a061-f2aa-430e-9898-b7adff8ccb50" (UID: "16a5a061-f2aa-430e-9898-b7adff8ccb50"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.062123 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-config" (OuterVolumeSpecName: "config") pod "16a5a061-f2aa-430e-9898-b7adff8ccb50" (UID: "16a5a061-f2aa-430e-9898-b7adff8ccb50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.066493 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a5a061-f2aa-430e-9898-b7adff8ccb50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16a5a061-f2aa-430e-9898-b7adff8ccb50" (UID: "16a5a061-f2aa-430e-9898-b7adff8ccb50"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.066516 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a5a061-f2aa-430e-9898-b7adff8ccb50-kube-api-access-2rw2l" (OuterVolumeSpecName: "kube-api-access-2rw2l") pod "16a5a061-f2aa-430e-9898-b7adff8ccb50" (UID: "16a5a061-f2aa-430e-9898-b7adff8ccb50"). InnerVolumeSpecName "kube-api-access-2rw2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.160627 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-config\") pod \"43bb746f-62c0-45c5-b1db-490810a0ba0e\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.160724 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43bb746f-62c0-45c5-b1db-490810a0ba0e-serving-cert\") pod \"43bb746f-62c0-45c5-b1db-490810a0ba0e\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.160758 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmcv9\" (UniqueName: \"kubernetes.io/projected/43bb746f-62c0-45c5-b1db-490810a0ba0e-kube-api-access-vmcv9\") pod \"43bb746f-62c0-45c5-b1db-490810a0ba0e\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.160817 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-client-ca\") pod \"43bb746f-62c0-45c5-b1db-490810a0ba0e\" (UID: \"43bb746f-62c0-45c5-b1db-490810a0ba0e\") " Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.160996 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-client-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.161009 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a5a061-f2aa-430e-9898-b7adff8ccb50-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc 
kubenswrapper[4606]: I1212 00:28:39.161018 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.161028 4606 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16a5a061-f2aa-430e-9898-b7adff8ccb50-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.161038 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rw2l\" (UniqueName: \"kubernetes.io/projected/16a5a061-f2aa-430e-9898-b7adff8ccb50-kube-api-access-2rw2l\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.161625 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-config" (OuterVolumeSpecName: "config") pod "43bb746f-62c0-45c5-b1db-490810a0ba0e" (UID: "43bb746f-62c0-45c5-b1db-490810a0ba0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.161740 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-client-ca" (OuterVolumeSpecName: "client-ca") pod "43bb746f-62c0-45c5-b1db-490810a0ba0e" (UID: "43bb746f-62c0-45c5-b1db-490810a0ba0e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.163549 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43bb746f-62c0-45c5-b1db-490810a0ba0e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "43bb746f-62c0-45c5-b1db-490810a0ba0e" (UID: "43bb746f-62c0-45c5-b1db-490810a0ba0e"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.163561 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bb746f-62c0-45c5-b1db-490810a0ba0e-kube-api-access-vmcv9" (OuterVolumeSpecName: "kube-api-access-vmcv9") pod "43bb746f-62c0-45c5-b1db-490810a0ba0e" (UID: "43bb746f-62c0-45c5-b1db-490810a0ba0e"). InnerVolumeSpecName "kube-api-access-vmcv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.261728 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43bb746f-62c0-45c5-b1db-490810a0ba0e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.261764 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmcv9\" (UniqueName: \"kubernetes.io/projected/43bb746f-62c0-45c5-b1db-490810a0ba0e-kube-api-access-vmcv9\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.261774 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.261782 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bb746f-62c0-45c5-b1db-490810a0ba0e-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.678607 4606 generic.go:334] "Generic (PLEG): container finished" podID="43bb746f-62c0-45c5-b1db-490810a0ba0e" containerID="478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35" exitCode=0 Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.678695 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" event={"ID":"43bb746f-62c0-45c5-b1db-490810a0ba0e","Type":"ContainerDied","Data":"478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35"} Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.678730 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" event={"ID":"43bb746f-62c0-45c5-b1db-490810a0ba0e","Type":"ContainerDied","Data":"ef8780fd900c9a3ff15e14ca55a321065bc15088435b697139975b77857ab7e6"} Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.678761 4606 scope.go:117] "RemoveContainer" containerID="478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.678887 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.684917 4606 generic.go:334] "Generic (PLEG): container finished" podID="16a5a061-f2aa-430e-9898-b7adff8ccb50" containerID="b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9" exitCode=0 Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.684984 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" event={"ID":"16a5a061-f2aa-430e-9898-b7adff8ccb50","Type":"ContainerDied","Data":"b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9"} Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.685031 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" event={"ID":"16a5a061-f2aa-430e-9898-b7adff8ccb50","Type":"ContainerDied","Data":"eba57dcb592fb4f77aa6f032523d3a53c6709d99d908bb4d18a80f1a57ba16ff"} Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.685164 4606 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dd5sp" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.718167 4606 scope.go:117] "RemoveContainer" containerID="478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35" Dec 12 00:28:39 crc kubenswrapper[4606]: E1212 00:28:39.723273 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35\": container with ID starting with 478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35 not found: ID does not exist" containerID="478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.723395 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35"} err="failed to get container status \"478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35\": rpc error: code = NotFound desc = could not find container \"478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35\": container with ID starting with 478c918609a55c577a257697e7c2007813fc8796a1cf99f6bd7acbbb7ca53e35 not found: ID does not exist" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.723445 4606 scope.go:117] "RemoveContainer" containerID="b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.754089 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"] Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.759944 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-k5ghr"] Dec 12 00:28:39 crc kubenswrapper[4606]: 
I1212 00:28:39.766208 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dd5sp"] Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.771707 4606 scope.go:117] "RemoveContainer" containerID="b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.772109 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dd5sp"] Dec 12 00:28:39 crc kubenswrapper[4606]: E1212 00:28:39.772163 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9\": container with ID starting with b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9 not found: ID does not exist" containerID="b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9" Dec 12 00:28:39 crc kubenswrapper[4606]: I1212 00:28:39.772295 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9"} err="failed to get container status \"b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9\": rpc error: code = NotFound desc = could not find container \"b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9\": container with ID starting with b288c67cc787774b203dd95b89935cb0b8499a6c49600efbb6c5faba1ffeefb9 not found: ID does not exist" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.498034 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6588b54b8f-snh6l"] Dec 12 00:28:40 crc kubenswrapper[4606]: E1212 00:28:40.498382 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" containerName="installer" Dec 12 00:28:40 crc 
kubenswrapper[4606]: I1212 00:28:40.498401 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" containerName="installer" Dec 12 00:28:40 crc kubenswrapper[4606]: E1212 00:28:40.498410 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.498416 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 12 00:28:40 crc kubenswrapper[4606]: E1212 00:28:40.498426 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a5a061-f2aa-430e-9898-b7adff8ccb50" containerName="controller-manager" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.498432 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a5a061-f2aa-430e-9898-b7adff8ccb50" containerName="controller-manager" Dec 12 00:28:40 crc kubenswrapper[4606]: E1212 00:28:40.498449 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bb746f-62c0-45c5-b1db-490810a0ba0e" containerName="route-controller-manager" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.498455 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bb746f-62c0-45c5-b1db-490810a0ba0e" containerName="route-controller-manager" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.498571 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.498586 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="43bb746f-62c0-45c5-b1db-490810a0ba0e" containerName="route-controller-manager" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.498595 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="de9116f1-408c-4f51-9d29-0acfc5ed7f4f" containerName="installer" Dec 12 00:28:40 
crc kubenswrapper[4606]: I1212 00:28:40.498603 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a5a061-f2aa-430e-9898-b7adff8ccb50" containerName="controller-manager" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.499003 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.502719 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.503965 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src"] Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.504605 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.504978 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.507630 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.507664 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.508681 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.516154 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.517038 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.517431 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.517539 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.517623 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.517740 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.517551 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 12 00:28:40 crc 
kubenswrapper[4606]: I1212 00:28:40.524317 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.528094 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6588b54b8f-snh6l"] Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.537389 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src"] Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578245 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-client-ca\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578353 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-proxy-ca-bundles\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578398 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-config\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578486 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-client-ca\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578533 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl6sc\" (UniqueName: \"kubernetes.io/projected/69eda6fa-0514-47b0-8f96-f15a19328c98-kube-api-access-dl6sc\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578561 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69eda6fa-0514-47b0-8f96-f15a19328c98-serving-cert\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578596 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-config\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578618 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlh66\" (UniqueName: \"kubernetes.io/projected/34ba32d0-1782-494d-a0be-b745ff8a749c-kube-api-access-zlh66\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: 
\"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.578644 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba32d0-1782-494d-a0be-b745ff8a749c-serving-cert\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.679841 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba32d0-1782-494d-a0be-b745ff8a749c-serving-cert\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.679934 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-client-ca\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.679987 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-proxy-ca-bundles\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.680013 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-config\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.680054 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-client-ca\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.680078 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl6sc\" (UniqueName: \"kubernetes.io/projected/69eda6fa-0514-47b0-8f96-f15a19328c98-kube-api-access-dl6sc\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.680495 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69eda6fa-0514-47b0-8f96-f15a19328c98-serving-cert\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.681060 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-config\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc 
kubenswrapper[4606]: I1212 00:28:40.681241 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-client-ca\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.681277 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlh66\" (UniqueName: \"kubernetes.io/projected/34ba32d0-1782-494d-a0be-b745ff8a749c-kube-api-access-zlh66\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.681232 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-client-ca\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.681368 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-proxy-ca-bundles\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.682061 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-config\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: 
\"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.682632 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-config\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.683942 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba32d0-1782-494d-a0be-b745ff8a749c-serving-cert\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.686361 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69eda6fa-0514-47b0-8f96-f15a19328c98-serving-cert\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.696054 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl6sc\" (UniqueName: \"kubernetes.io/projected/69eda6fa-0514-47b0-8f96-f15a19328c98-kube-api-access-dl6sc\") pod \"controller-manager-6588b54b8f-snh6l\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.701244 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlh66\" (UniqueName: 
\"kubernetes.io/projected/34ba32d0-1782-494d-a0be-b745ff8a749c-kube-api-access-zlh66\") pod \"route-controller-manager-7887f885ff-c6src\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.827044 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:40 crc kubenswrapper[4606]: I1212 00:28:40.842082 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.060709 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6588b54b8f-snh6l"] Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.120787 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src"] Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.673681 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6588b54b8f-snh6l"] Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.690185 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src"] Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.704924 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a5a061-f2aa-430e-9898-b7adff8ccb50" path="/var/lib/kubelet/pods/16a5a061-f2aa-430e-9898-b7adff8ccb50/volumes" Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.705836 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43bb746f-62c0-45c5-b1db-490810a0ba0e" path="/var/lib/kubelet/pods/43bb746f-62c0-45c5-b1db-490810a0ba0e/volumes" Dec 12 
00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.706373 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.706404 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" event={"ID":"69eda6fa-0514-47b0-8f96-f15a19328c98","Type":"ContainerStarted","Data":"50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956"} Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.706423 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.706436 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" event={"ID":"69eda6fa-0514-47b0-8f96-f15a19328c98","Type":"ContainerStarted","Data":"986d525dfae69e1392e1085108188b6ceb9322546735bac93f76f1d7c526705e"} Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.706450 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" event={"ID":"34ba32d0-1782-494d-a0be-b745ff8a749c","Type":"ContainerStarted","Data":"e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094"} Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.706461 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" event={"ID":"34ba32d0-1782-494d-a0be-b745ff8a749c","Type":"ContainerStarted","Data":"3956d7f31da40fb657166feacbe3422b2818a7e9260197913697b756141f27f6"} Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.708236 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.715211 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.801191 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" podStartSLOduration=3.80106315 podStartE2EDuration="3.80106315s" podCreationTimestamp="2025-12-12 00:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:28:41.76561878 +0000 UTC m=+312.310971646" watchObservedRunningTime="2025-12-12 00:28:41.80106315 +0000 UTC m=+312.346416006" Dec 12 00:28:41 crc kubenswrapper[4606]: I1212 00:28:41.838264 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" podStartSLOduration=3.838246087 podStartE2EDuration="3.838246087s" podCreationTimestamp="2025-12-12 00:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:28:41.808593576 +0000 UTC m=+312.353946442" watchObservedRunningTime="2025-12-12 00:28:41.838246087 +0000 UTC m=+312.383598953" Dec 12 00:28:42 crc kubenswrapper[4606]: I1212 00:28:42.708956 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" podUID="69eda6fa-0514-47b0-8f96-f15a19328c98" containerName="controller-manager" containerID="cri-o://50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956" gracePeriod=30 Dec 12 00:28:42 crc kubenswrapper[4606]: I1212 00:28:42.708997 4606 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" podUID="34ba32d0-1782-494d-a0be-b745ff8a749c" containerName="route-controller-manager" containerID="cri-o://e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094" gracePeriod=30 Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.199992 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.205358 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.230661 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8"] Dec 12 00:28:43 crc kubenswrapper[4606]: E1212 00:28:43.230898 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eda6fa-0514-47b0-8f96-f15a19328c98" containerName="controller-manager" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.230912 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eda6fa-0514-47b0-8f96-f15a19328c98" containerName="controller-manager" Dec 12 00:28:43 crc kubenswrapper[4606]: E1212 00:28:43.230927 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ba32d0-1782-494d-a0be-b745ff8a749c" containerName="route-controller-manager" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.230937 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ba32d0-1782-494d-a0be-b745ff8a749c" containerName="route-controller-manager" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.231091 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="69eda6fa-0514-47b0-8f96-f15a19328c98" containerName="controller-manager" Dec 12 00:28:43 crc 
kubenswrapper[4606]: I1212 00:28:43.231137 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ba32d0-1782-494d-a0be-b745ff8a749c" containerName="route-controller-manager" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.231716 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.255087 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8"] Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318066 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69eda6fa-0514-47b0-8f96-f15a19328c98-serving-cert\") pod \"69eda6fa-0514-47b0-8f96-f15a19328c98\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318139 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-proxy-ca-bundles\") pod \"69eda6fa-0514-47b0-8f96-f15a19328c98\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318204 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl6sc\" (UniqueName: \"kubernetes.io/projected/69eda6fa-0514-47b0-8f96-f15a19328c98-kube-api-access-dl6sc\") pod \"69eda6fa-0514-47b0-8f96-f15a19328c98\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318221 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-client-ca\") pod \"69eda6fa-0514-47b0-8f96-f15a19328c98\" (UID: 
\"69eda6fa-0514-47b0-8f96-f15a19328c98\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318241 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlh66\" (UniqueName: \"kubernetes.io/projected/34ba32d0-1782-494d-a0be-b745ff8a749c-kube-api-access-zlh66\") pod \"34ba32d0-1782-494d-a0be-b745ff8a749c\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318264 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-config\") pod \"34ba32d0-1782-494d-a0be-b745ff8a749c\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318284 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba32d0-1782-494d-a0be-b745ff8a749c-serving-cert\") pod \"34ba32d0-1782-494d-a0be-b745ff8a749c\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318671 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-client-ca" (OuterVolumeSpecName: "client-ca") pod "69eda6fa-0514-47b0-8f96-f15a19328c98" (UID: "69eda6fa-0514-47b0-8f96-f15a19328c98"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318832 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-config" (OuterVolumeSpecName: "config") pod "34ba32d0-1782-494d-a0be-b745ff8a749c" (UID: "34ba32d0-1782-494d-a0be-b745ff8a749c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318868 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "69eda6fa-0514-47b0-8f96-f15a19328c98" (UID: "69eda6fa-0514-47b0-8f96-f15a19328c98"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.318306 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-client-ca\") pod \"34ba32d0-1782-494d-a0be-b745ff8a749c\" (UID: \"34ba32d0-1782-494d-a0be-b745ff8a749c\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319094 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-config\") pod \"69eda6fa-0514-47b0-8f96-f15a19328c98\" (UID: \"69eda6fa-0514-47b0-8f96-f15a19328c98\") " Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319380 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-client-ca\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319424 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w59jt\" (UniqueName: \"kubernetes.io/projected/c6499dc6-f01c-4d32-a671-8af65df96665-kube-api-access-w59jt\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: 
\"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319463 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-config\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319492 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6499dc6-f01c-4d32-a671-8af65df96665-serving-cert\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319528 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-client-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319538 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319547 4606 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319571 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-client-ca" (OuterVolumeSpecName: "client-ca") pod "34ba32d0-1782-494d-a0be-b745ff8a749c" (UID: "34ba32d0-1782-494d-a0be-b745ff8a749c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.319641 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-config" (OuterVolumeSpecName: "config") pod "69eda6fa-0514-47b0-8f96-f15a19328c98" (UID: "69eda6fa-0514-47b0-8f96-f15a19328c98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.323189 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ba32d0-1782-494d-a0be-b745ff8a749c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "34ba32d0-1782-494d-a0be-b745ff8a749c" (UID: "34ba32d0-1782-494d-a0be-b745ff8a749c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.323252 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69eda6fa-0514-47b0-8f96-f15a19328c98-kube-api-access-dl6sc" (OuterVolumeSpecName: "kube-api-access-dl6sc") pod "69eda6fa-0514-47b0-8f96-f15a19328c98" (UID: "69eda6fa-0514-47b0-8f96-f15a19328c98"). InnerVolumeSpecName "kube-api-access-dl6sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.326014 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69eda6fa-0514-47b0-8f96-f15a19328c98-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "69eda6fa-0514-47b0-8f96-f15a19328c98" (UID: "69eda6fa-0514-47b0-8f96-f15a19328c98"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.327355 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ba32d0-1782-494d-a0be-b745ff8a749c-kube-api-access-zlh66" (OuterVolumeSpecName: "kube-api-access-zlh66") pod "34ba32d0-1782-494d-a0be-b745ff8a749c" (UID: "34ba32d0-1782-494d-a0be-b745ff8a749c"). InnerVolumeSpecName "kube-api-access-zlh66". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.420906 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w59jt\" (UniqueName: \"kubernetes.io/projected/c6499dc6-f01c-4d32-a671-8af65df96665-kube-api-access-w59jt\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.420979 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-config\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421021 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6499dc6-f01c-4d32-a671-8af65df96665-serving-cert\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421056 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-client-ca\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421104 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlh66\" (UniqueName: \"kubernetes.io/projected/34ba32d0-1782-494d-a0be-b745ff8a749c-kube-api-access-zlh66\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421118 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba32d0-1782-494d-a0be-b745ff8a749c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421130 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34ba32d0-1782-494d-a0be-b745ff8a749c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421141 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69eda6fa-0514-47b0-8f96-f15a19328c98-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421152 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69eda6fa-0514-47b0-8f96-f15a19328c98-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.421163 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl6sc\" (UniqueName: \"kubernetes.io/projected/69eda6fa-0514-47b0-8f96-f15a19328c98-kube-api-access-dl6sc\") on node \"crc\" DevicePath \"\"" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.422318 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-config\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.422707 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-client-ca\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.430156 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6499dc6-f01c-4d32-a671-8af65df96665-serving-cert\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.436064 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w59jt\" (UniqueName: \"kubernetes.io/projected/c6499dc6-f01c-4d32-a671-8af65df96665-kube-api-access-w59jt\") pod \"route-controller-manager-585b55f498-rl7j8\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.546676 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.717706 4606 generic.go:334] "Generic (PLEG): container finished" podID="69eda6fa-0514-47b0-8f96-f15a19328c98" containerID="50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956" exitCode=0 Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.717757 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" event={"ID":"69eda6fa-0514-47b0-8f96-f15a19328c98","Type":"ContainerDied","Data":"50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956"} Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.717782 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" event={"ID":"69eda6fa-0514-47b0-8f96-f15a19328c98","Type":"ContainerDied","Data":"986d525dfae69e1392e1085108188b6ceb9322546735bac93f76f1d7c526705e"} Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.717799 4606 scope.go:117] "RemoveContainer" containerID="50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.717888 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6588b54b8f-snh6l" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.730997 4606 generic.go:334] "Generic (PLEG): container finished" podID="34ba32d0-1782-494d-a0be-b745ff8a749c" containerID="e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094" exitCode=0 Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.731398 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" event={"ID":"34ba32d0-1782-494d-a0be-b745ff8a749c","Type":"ContainerDied","Data":"e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094"} Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.731539 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" event={"ID":"34ba32d0-1782-494d-a0be-b745ff8a749c","Type":"ContainerDied","Data":"3956d7f31da40fb657166feacbe3422b2818a7e9260197913697b756141f27f6"} Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.731441 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.762786 4606 scope.go:117] "RemoveContainer" containerID="50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956" Dec 12 00:28:43 crc kubenswrapper[4606]: E1212 00:28:43.763951 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956\": container with ID starting with 50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956 not found: ID does not exist" containerID="50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.763989 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956"} err="failed to get container status \"50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956\": rpc error: code = NotFound desc = could not find container \"50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956\": container with ID starting with 50dedca4c3c8e6da1de0ffba6aff4485ac6cef297cae045c1f321c5848591956 not found: ID does not exist" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.764016 4606 scope.go:117] "RemoveContainer" containerID="e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.771926 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8"] Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.776885 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src"] Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.781325 4606 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7887f885ff-c6src"] Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.784033 4606 scope.go:117] "RemoveContainer" containerID="e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094" Dec 12 00:28:43 crc kubenswrapper[4606]: E1212 00:28:43.785041 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094\": container with ID starting with e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094 not found: ID does not exist" containerID="e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.785140 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094"} err="failed to get container status \"e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094\": rpc error: code = NotFound desc = could not find container \"e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094\": container with ID starting with e4fca33b81b6d7bc697aaa838068a91761da1f4e0a497bf6fa68e0e7646a6094 not found: ID does not exist" Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.785248 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6588b54b8f-snh6l"] Dec 12 00:28:43 crc kubenswrapper[4606]: I1212 00:28:43.789075 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6588b54b8f-snh6l"] Dec 12 00:28:44 crc kubenswrapper[4606]: I1212 00:28:44.738880 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" 
event={"ID":"c6499dc6-f01c-4d32-a671-8af65df96665","Type":"ContainerStarted","Data":"901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39"} Dec 12 00:28:44 crc kubenswrapper[4606]: I1212 00:28:44.738932 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" event={"ID":"c6499dc6-f01c-4d32-a671-8af65df96665","Type":"ContainerStarted","Data":"522ff1ff058852bf929bdbafd5a1bc57fe5b6b3255fc45a04f091ed660557148"} Dec 12 00:28:44 crc kubenswrapper[4606]: I1212 00:28:44.739286 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:44 crc kubenswrapper[4606]: I1212 00:28:44.745087 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:28:44 crc kubenswrapper[4606]: I1212 00:28:44.758084 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" podStartSLOduration=3.758039282 podStartE2EDuration="3.758039282s" podCreationTimestamp="2025-12-12 00:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:28:44.756425108 +0000 UTC m=+315.301778034" watchObservedRunningTime="2025-12-12 00:28:44.758039282 +0000 UTC m=+315.303392168" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.497365 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b974966b8-2pkbn"] Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.498367 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.500553 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.500688 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.503838 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.504300 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.504317 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.506651 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.518695 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b974966b8-2pkbn"] Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.523423 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.549500 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3742fdc-2cac-49fb-945b-11b645a54119-serving-cert\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " 
pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.549565 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-config\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.549607 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-proxy-ca-bundles\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.549641 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nb9w\" (UniqueName: \"kubernetes.io/projected/d3742fdc-2cac-49fb-945b-11b645a54119-kube-api-access-8nb9w\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.549669 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-client-ca\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.650126 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-proxy-ca-bundles\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.650201 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nb9w\" (UniqueName: \"kubernetes.io/projected/d3742fdc-2cac-49fb-945b-11b645a54119-kube-api-access-8nb9w\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.650233 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-client-ca\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.650296 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3742fdc-2cac-49fb-945b-11b645a54119-serving-cert\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.650328 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-config\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.651843 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-config\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.653509 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-proxy-ca-bundles\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.653778 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-client-ca\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.661239 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3742fdc-2cac-49fb-945b-11b645a54119-serving-cert\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.685711 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nb9w\" (UniqueName: \"kubernetes.io/projected/d3742fdc-2cac-49fb-945b-11b645a54119-kube-api-access-8nb9w\") pod \"controller-manager-b974966b8-2pkbn\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:45 crc 
kubenswrapper[4606]: I1212 00:28:45.712280 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ba32d0-1782-494d-a0be-b745ff8a749c" path="/var/lib/kubelet/pods/34ba32d0-1782-494d-a0be-b745ff8a749c/volumes" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.713659 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69eda6fa-0514-47b0-8f96-f15a19328c98" path="/var/lib/kubelet/pods/69eda6fa-0514-47b0-8f96-f15a19328c98/volumes" Dec 12 00:28:45 crc kubenswrapper[4606]: I1212 00:28:45.819620 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:46 crc kubenswrapper[4606]: I1212 00:28:46.080051 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b974966b8-2pkbn"] Dec 12 00:28:46 crc kubenswrapper[4606]: W1212 00:28:46.092744 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3742fdc_2cac_49fb_945b_11b645a54119.slice/crio-bde59295a1843a4c352d9165451a46a01ab2e5081bf628199a53c11adcf495cf WatchSource:0}: Error finding container bde59295a1843a4c352d9165451a46a01ab2e5081bf628199a53c11adcf495cf: Status 404 returned error can't find the container with id bde59295a1843a4c352d9165451a46a01ab2e5081bf628199a53c11adcf495cf Dec 12 00:28:46 crc kubenswrapper[4606]: I1212 00:28:46.768416 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" event={"ID":"d3742fdc-2cac-49fb-945b-11b645a54119","Type":"ContainerStarted","Data":"7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299"} Dec 12 00:28:46 crc kubenswrapper[4606]: I1212 00:28:46.768735 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" 
event={"ID":"d3742fdc-2cac-49fb-945b-11b645a54119","Type":"ContainerStarted","Data":"bde59295a1843a4c352d9165451a46a01ab2e5081bf628199a53c11adcf495cf"} Dec 12 00:28:46 crc kubenswrapper[4606]: I1212 00:28:46.770608 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:46 crc kubenswrapper[4606]: I1212 00:28:46.776469 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:28:46 crc kubenswrapper[4606]: I1212 00:28:46.787199 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" podStartSLOduration=5.78718325 podStartE2EDuration="5.78718325s" podCreationTimestamp="2025-12-12 00:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:28:46.786745288 +0000 UTC m=+317.332098154" watchObservedRunningTime="2025-12-12 00:28:46.78718325 +0000 UTC m=+317.332536116" Dec 12 00:29:02 crc kubenswrapper[4606]: I1212 00:29:02.010328 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:29:02 crc kubenswrapper[4606]: I1212 00:29:02.010863 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.438513 4606 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-drkxp"] Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.440874 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-drkxp" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="registry-server" containerID="cri-o://abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24" gracePeriod=30 Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.456487 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xg5pj"] Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.456804 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xg5pj" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerName="registry-server" containerID="cri-o://ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60" gracePeriod=30 Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.463208 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-544fr"] Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.463404 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator" containerID="cri-o://f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6" gracePeriod=30 Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.470734 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5998"] Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.470992 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q5998" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" 
containerName="registry-server" containerID="cri-o://921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3" gracePeriod=30 Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.478160 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqkw9"] Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.478399 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fqkw9" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="registry-server" containerID="cri-o://78c3f9ad0cf1e9fa50796a473926316088ec8645f169bedad0718bfac093b0d5" gracePeriod=30 Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.482140 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rq529"] Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.482721 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.510470 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rq529"] Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.570755 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.570808 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxd2\" (UniqueName: \"kubernetes.io/projected/d4738423-c294-4595-8656-3a2ebd437a75-kube-api-access-fbxd2\") pod 
\"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.570876 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.673006 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.673065 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.673088 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxd2\" (UniqueName: \"kubernetes.io/projected/d4738423-c294-4595-8656-3a2ebd437a75-kube-api-access-fbxd2\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.675441 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.678777 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.689523 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxd2\" (UniqueName: \"kubernetes.io/projected/d4738423-c294-4595-8656-3a2ebd437a75-kube-api-access-fbxd2\") pod \"marketplace-operator-79b997595-rq529\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.801627 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.912733 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.977603 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drsn7\" (UniqueName: \"kubernetes.io/projected/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-kube-api-access-drsn7\") pod \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.977692 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-utilities\") pod \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.977773 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-catalog-content\") pod \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\" (UID: \"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4\") " Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.978783 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-utilities" (OuterVolumeSpecName: "utilities") pod "34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" (UID: "34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:08 crc kubenswrapper[4606]: I1212 00:29:08.988782 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-kube-api-access-drsn7" (OuterVolumeSpecName: "kube-api-access-drsn7") pod "34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" (UID: "34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4"). InnerVolumeSpecName "kube-api-access-drsn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.030332 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" (UID: "34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.066709 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.079535 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.079570 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drsn7\" (UniqueName: \"kubernetes.io/projected/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-kube-api-access-drsn7\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.079591 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.180442 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-catalog-content\") pod \"16aefbaf-f9b9-452e-84fa-a710b9284349\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.181452 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-5btch\" (UniqueName: \"kubernetes.io/projected/16aefbaf-f9b9-452e-84fa-a710b9284349-kube-api-access-5btch\") pod \"16aefbaf-f9b9-452e-84fa-a710b9284349\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.181524 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-utilities\") pod \"16aefbaf-f9b9-452e-84fa-a710b9284349\" (UID: \"16aefbaf-f9b9-452e-84fa-a710b9284349\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.182709 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-utilities" (OuterVolumeSpecName: "utilities") pod "16aefbaf-f9b9-452e-84fa-a710b9284349" (UID: "16aefbaf-f9b9-452e-84fa-a710b9284349"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.185381 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16aefbaf-f9b9-452e-84fa-a710b9284349-kube-api-access-5btch" (OuterVolumeSpecName: "kube-api-access-5btch") pod "16aefbaf-f9b9-452e-84fa-a710b9284349" (UID: "16aefbaf-f9b9-452e-84fa-a710b9284349"). InnerVolumeSpecName "kube-api-access-5btch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.234299 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16aefbaf-f9b9-452e-84fa-a710b9284349" (UID: "16aefbaf-f9b9-452e-84fa-a710b9284349"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.237649 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.245115 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.282684 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.282707 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aefbaf-f9b9-452e-84fa-a710b9284349-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.282716 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5btch\" (UniqueName: \"kubernetes.io/projected/16aefbaf-f9b9-452e-84fa-a710b9284349-kube-api-access-5btch\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.283895 4606 generic.go:334] "Generic (PLEG): container finished" podID="303940e6-1922-4197-ad2a-6524c192b1b5" containerID="78c3f9ad0cf1e9fa50796a473926316088ec8645f169bedad0718bfac093b0d5" exitCode=0 Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.283953 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkw9" event={"ID":"303940e6-1922-4197-ad2a-6524c192b1b5","Type":"ContainerDied","Data":"78c3f9ad0cf1e9fa50796a473926316088ec8645f169bedad0718bfac093b0d5"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.289496 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerID="921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3" exitCode=0 Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.289558 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5998" event={"ID":"ee26d00d-079d-41c7-b641-5f2373eef2ee","Type":"ContainerDied","Data":"921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.289587 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q5998" event={"ID":"ee26d00d-079d-41c7-b641-5f2373eef2ee","Type":"ContainerDied","Data":"1dc5d89429b0743211d02a00386457a161b9ebf9a4bc8d8efd00b3c28c289ad6"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.289604 4606 scope.go:117] "RemoveContainer" containerID="921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.289717 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q5998" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.300772 4606 generic.go:334] "Generic (PLEG): container finished" podID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerID="abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24" exitCode=0 Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.300830 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drkxp" event={"ID":"16aefbaf-f9b9-452e-84fa-a710b9284349","Type":"ContainerDied","Data":"abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.300857 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drkxp" event={"ID":"16aefbaf-f9b9-452e-84fa-a710b9284349","Type":"ContainerDied","Data":"c97d7db2b7cfa6f202e7f21298fade29ee8280a38549795f518938dfa846a454"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.300919 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drkxp" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.310839 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.310835 4606 generic.go:334] "Generic (PLEG): container finished" podID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerID="f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6" exitCode=0 Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.310860 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" event={"ID":"d5b97b3b-8994-4d7f-a165-c04d13546e89","Type":"ContainerDied","Data":"f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.311259 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" event={"ID":"d5b97b3b-8994-4d7f-a165-c04d13546e89","Type":"ContainerDied","Data":"40487f2c64688dbbe919542e52b95392d2fdf203e6bf59fca03ab76a8fdf3b9b"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.310903 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-544fr" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.311426 4606 scope.go:117] "RemoveContainer" containerID="ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.316186 4606 generic.go:334] "Generic (PLEG): container finished" podID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerID="ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60" exitCode=0 Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.316256 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg5pj" event={"ID":"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4","Type":"ContainerDied","Data":"ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.316286 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg5pj" event={"ID":"34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4","Type":"ContainerDied","Data":"ead8403e15c823bdec4438306689ab40fb499d708c8b11597eaa155b8bdc63dc"} Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.316354 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xg5pj" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.340270 4606 scope.go:117] "RemoveContainer" containerID="db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.353407 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-drkxp"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.372957 4606 scope.go:117] "RemoveContainer" containerID="921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.373488 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3\": container with ID starting with 921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3 not found: ID does not exist" containerID="921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.373539 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3"} err="failed to get container status \"921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3\": rpc error: code = NotFound desc = could not find container \"921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3\": container with ID starting with 921af76b9dbda9d86c1cec7c7610a9a4396e93e859431c6729a55e9f4af917e3 not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.373568 4606 scope.go:117] "RemoveContainer" containerID="ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.373784 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d\": container with ID starting with ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d not found: ID does not exist" containerID="ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.373812 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d"} err="failed to get container status \"ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d\": rpc error: code = NotFound desc = could not find container \"ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d\": container with ID starting with ebc3634ed151afc936779c7397fec3d15c9ac679b8f7908f8132b99d20e7186d not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.373826 4606 scope.go:117] "RemoveContainer" containerID="db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.373986 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f\": container with ID starting with db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f not found: ID does not exist" containerID="db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.374008 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f"} err="failed to get container status \"db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f\": rpc error: code = NotFound desc = could not find container 
\"db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f\": container with ID starting with db3e0c552c5a09d6984055041f40cb04a87efd2ef550498502ab1c9e729b801f not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.374021 4606 scope.go:117] "RemoveContainer" containerID="abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.374646 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-drkxp"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.383207 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54mqq\" (UniqueName: \"kubernetes.io/projected/303940e6-1922-4197-ad2a-6524c192b1b5-kube-api-access-54mqq\") pod \"303940e6-1922-4197-ad2a-6524c192b1b5\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.383250 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-utilities\") pod \"303940e6-1922-4197-ad2a-6524c192b1b5\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.383290 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46jbj\" (UniqueName: \"kubernetes.io/projected/ee26d00d-079d-41c7-b641-5f2373eef2ee-kube-api-access-46jbj\") pod \"ee26d00d-079d-41c7-b641-5f2373eef2ee\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.383321 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-catalog-content\") pod \"ee26d00d-079d-41c7-b641-5f2373eef2ee\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " Dec 12 
00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.383343 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j5bk\" (UniqueName: \"kubernetes.io/projected/d5b97b3b-8994-4d7f-a165-c04d13546e89-kube-api-access-8j5bk\") pod \"d5b97b3b-8994-4d7f-a165-c04d13546e89\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.384518 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-operator-metrics\") pod \"d5b97b3b-8994-4d7f-a165-c04d13546e89\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.385508 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-utilities" (OuterVolumeSpecName: "utilities") pod "303940e6-1922-4197-ad2a-6524c192b1b5" (UID: "303940e6-1922-4197-ad2a-6524c192b1b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.387149 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-utilities" (OuterVolumeSpecName: "utilities") pod "ee26d00d-079d-41c7-b641-5f2373eef2ee" (UID: "ee26d00d-079d-41c7-b641-5f2373eef2ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.387012 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-utilities\") pod \"ee26d00d-079d-41c7-b641-5f2373eef2ee\" (UID: \"ee26d00d-079d-41c7-b641-5f2373eef2ee\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.388669 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-catalog-content\") pod \"303940e6-1922-4197-ad2a-6524c192b1b5\" (UID: \"303940e6-1922-4197-ad2a-6524c192b1b5\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.388707 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-trusted-ca\") pod \"d5b97b3b-8994-4d7f-a165-c04d13546e89\" (UID: \"d5b97b3b-8994-4d7f-a165-c04d13546e89\") " Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.389192 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b97b3b-8994-4d7f-a165-c04d13546e89-kube-api-access-8j5bk" (OuterVolumeSpecName: "kube-api-access-8j5bk") pod "d5b97b3b-8994-4d7f-a165-c04d13546e89" (UID: "d5b97b3b-8994-4d7f-a165-c04d13546e89"). InnerVolumeSpecName "kube-api-access-8j5bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.390097 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d5b97b3b-8994-4d7f-a165-c04d13546e89" (UID: "d5b97b3b-8994-4d7f-a165-c04d13546e89"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.394273 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303940e6-1922-4197-ad2a-6524c192b1b5-kube-api-access-54mqq" (OuterVolumeSpecName: "kube-api-access-54mqq") pod "303940e6-1922-4197-ad2a-6524c192b1b5" (UID: "303940e6-1922-4197-ad2a-6524c192b1b5"). InnerVolumeSpecName "kube-api-access-54mqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.396428 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee26d00d-079d-41c7-b641-5f2373eef2ee-kube-api-access-46jbj" (OuterVolumeSpecName: "kube-api-access-46jbj") pod "ee26d00d-079d-41c7-b641-5f2373eef2ee" (UID: "ee26d00d-079d-41c7-b641-5f2373eef2ee"). InnerVolumeSpecName "kube-api-access-46jbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.396478 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xg5pj"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.401783 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.401813 4606 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.403624 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54mqq\" (UniqueName: \"kubernetes.io/projected/303940e6-1922-4197-ad2a-6524c192b1b5-kube-api-access-54mqq\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc 
kubenswrapper[4606]: I1212 00:29:09.404009 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.404472 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46jbj\" (UniqueName: \"kubernetes.io/projected/ee26d00d-079d-41c7-b641-5f2373eef2ee-kube-api-access-46jbj\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.404835 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j5bk\" (UniqueName: \"kubernetes.io/projected/d5b97b3b-8994-4d7f-a165-c04d13546e89-kube-api-access-8j5bk\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.408659 4606 scope.go:117] "RemoveContainer" containerID="49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.409790 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d5b97b3b-8994-4d7f-a165-c04d13546e89" (UID: "d5b97b3b-8994-4d7f-a165-c04d13546e89"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.413124 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee26d00d-079d-41c7-b641-5f2373eef2ee" (UID: "ee26d00d-079d-41c7-b641-5f2373eef2ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.416231 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xg5pj"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.420272 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rq529"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.475159 4606 scope.go:117] "RemoveContainer" containerID="406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.494022 4606 scope.go:117] "RemoveContainer" containerID="abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.494718 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24\": container with ID starting with abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24 not found: ID does not exist" containerID="abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.494755 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24"} err="failed to get container status \"abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24\": rpc error: code = NotFound desc = could not find container \"abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24\": container with ID starting with abe99af364e8244bff119476cc08dcf3b6a56be253275b60b224e0ddc05eec24 not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.494796 4606 scope.go:117] "RemoveContainer" 
containerID="49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.495161 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d\": container with ID starting with 49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d not found: ID does not exist" containerID="49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.495239 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d"} err="failed to get container status \"49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d\": rpc error: code = NotFound desc = could not find container \"49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d\": container with ID starting with 49ad1c6582f45f7ee377a6f339952af147df9d234fc0a6e0776ccb6fde77ad7d not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.495269 4606 scope.go:117] "RemoveContainer" containerID="406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.495612 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa\": container with ID starting with 406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa not found: ID does not exist" containerID="406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.495641 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa"} err="failed to get container status \"406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa\": rpc error: code = NotFound desc = could not find container \"406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa\": container with ID starting with 406f7d08228d29199e42746334478922a308b9e64f193fdbd10718dbd55ef7fa not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.495657 4606 scope.go:117] "RemoveContainer" containerID="f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.506689 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee26d00d-079d-41c7-b641-5f2373eef2ee-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.506734 4606 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d5b97b3b-8994-4d7f-a165-c04d13546e89-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.511001 4606 scope.go:117] "RemoveContainer" containerID="edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.530844 4606 scope.go:117] "RemoveContainer" containerID="f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.531686 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6\": container with ID starting with f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6 not found: ID does not exist" 
containerID="f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.531720 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6"} err="failed to get container status \"f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6\": rpc error: code = NotFound desc = could not find container \"f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6\": container with ID starting with f55b8074b8bed57843a80bd0cf27d76f607aba8f14e884901d3445646881acd6 not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.531749 4606 scope.go:117] "RemoveContainer" containerID="edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.532135 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a\": container with ID starting with edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a not found: ID does not exist" containerID="edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.532163 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a"} err="failed to get container status \"edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a\": rpc error: code = NotFound desc = could not find container \"edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a\": container with ID starting with edd206a36798d813911a2fd408e5191bffe6745cf0f3e5c6e01da4b3757da49a not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.532233 4606 scope.go:117] 
"RemoveContainer" containerID="ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.545646 4606 scope.go:117] "RemoveContainer" containerID="bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.551844 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "303940e6-1922-4197-ad2a-6524c192b1b5" (UID: "303940e6-1922-4197-ad2a-6524c192b1b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.561367 4606 scope.go:117] "RemoveContainer" containerID="ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.608270 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303940e6-1922-4197-ad2a-6524c192b1b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.616474 4606 scope.go:117] "RemoveContainer" containerID="ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.616942 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60\": container with ID starting with ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60 not found: ID does not exist" containerID="ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.616982 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60"} err="failed to get container status \"ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60\": rpc error: code = NotFound desc = could not find container \"ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60\": container with ID starting with ba3c7e4fc582bcd6e984c004f7d4e787f3870a52e1b6cd14daf6751753a6fe60 not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.617009 4606 scope.go:117] "RemoveContainer" containerID="bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.617264 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7\": container with ID starting with bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7 not found: ID does not exist" containerID="bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.617280 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7"} err="failed to get container status \"bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7\": rpc error: code = NotFound desc = could not find container \"bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7\": container with ID starting with bdd82eef506390486aad683f30431a03c99de4a60b63bdf9308c0fcd787bb6d7 not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.617294 4606 scope.go:117] "RemoveContainer" containerID="ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87" Dec 12 00:29:09 crc kubenswrapper[4606]: E1212 00:29:09.617472 4606 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87\": container with ID starting with ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87 not found: ID does not exist" containerID="ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.617488 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87"} err="failed to get container status \"ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87\": rpc error: code = NotFound desc = could not find container \"ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87\": container with ID starting with ffb804780da2159122527d7f19d3f8ef4d2196ca61d3098a3bdfa7837ff7be87 not found: ID does not exist" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.636289 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5998"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.644118 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q5998"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.667627 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-544fr"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.671154 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-544fr"] Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.705769 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" path="/var/lib/kubelet/pods/16aefbaf-f9b9-452e-84fa-a710b9284349/volumes" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.706571 4606 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" path="/var/lib/kubelet/pods/34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4/volumes" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.707391 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" path="/var/lib/kubelet/pods/d5b97b3b-8994-4d7f-a165-c04d13546e89/volumes" Dec 12 00:29:09 crc kubenswrapper[4606]: I1212 00:29:09.708614 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" path="/var/lib/kubelet/pods/ee26d00d-079d-41c7-b641-5f2373eef2ee/volumes" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.326738 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkw9" event={"ID":"303940e6-1922-4197-ad2a-6524c192b1b5","Type":"ContainerDied","Data":"eb51762e86cc272087d3623bc290edd5f5e73da812a29ff16042047144b4c8c5"} Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.326800 4606 scope.go:117] "RemoveContainer" containerID="78c3f9ad0cf1e9fa50796a473926316088ec8645f169bedad0718bfac093b0d5" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.326879 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkw9" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.345253 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" event={"ID":"d4738423-c294-4595-8656-3a2ebd437a75","Type":"ContainerStarted","Data":"5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9"} Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.345661 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" event={"ID":"d4738423-c294-4595-8656-3a2ebd437a75","Type":"ContainerStarted","Data":"2eacef2297de561f4ab9ba762aef3dd724123e7a7bb017e579f526dddb5bd7f4"} Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.349257 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.353803 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.368750 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqkw9"] Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.371582 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fqkw9"] Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.381361 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" podStartSLOduration=2.381341442 podStartE2EDuration="2.381341442s" podCreationTimestamp="2025-12-12 00:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:29:10.380158861 +0000 UTC m=+340.925511727" 
watchObservedRunningTime="2025-12-12 00:29:10.381341442 +0000 UTC m=+340.926694308" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.383502 4606 scope.go:117] "RemoveContainer" containerID="5fa430cbc98eaa886b23f7cb54253b4f5b9e0765e97c2c3069e0d74f2de5ade5" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.407113 4606 scope.go:117] "RemoveContainer" containerID="8d142d962b8d94334be0e3cb53e7c755953d22ba5b3e6f7ec56c694763725e87" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760685 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k7skb"] Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760873 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760884 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760896 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760903 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760913 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760918 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760926 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" 
containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760932 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760940 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760945 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760952 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760958 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760967 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760973 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760981 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.760987 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="extract-utilities" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.760995 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761000 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.761008 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761014 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.761022 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761030 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.761038 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761044 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="extract-content" Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.761053 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761060 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerName="registry-server" Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761134 4606 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761147 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761156 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" containerName="registry-server"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761162 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="16aefbaf-f9b9-452e-84fa-a710b9284349" containerName="registry-server"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761190 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b189fb-cbc9-4ebe-bfa2-b8a4e1e0d4b4" containerName="registry-server"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761212 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee26d00d-079d-41c7-b641-5f2373eef2ee" containerName="registry-server"
Dec 12 00:29:10 crc kubenswrapper[4606]: E1212 00:29:10.761291 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761299 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b97b3b-8994-4d7f-a165-c04d13546e89" containerName="marketplace-operator"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.761848 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.763589 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.778317 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7skb"]
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.936747 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjsk\" (UniqueName: \"kubernetes.io/projected/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-kube-api-access-ssjsk\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.936868 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-utilities\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.936889 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-catalog-content\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.963796 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w8wq4"]
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.964706 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.969597 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 12 00:29:10 crc kubenswrapper[4606]: I1212 00:29:10.980574 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8wq4"]
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.038230 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-catalog-content\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.038278 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-utilities\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.038309 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjsk\" (UniqueName: \"kubernetes.io/projected/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-kube-api-access-ssjsk\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.038933 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-catalog-content\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.039144 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-utilities\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.064659 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjsk\" (UniqueName: \"kubernetes.io/projected/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-kube-api-access-ssjsk\") pod \"redhat-marketplace-k7skb\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.075653 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.140361 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9627\" (UniqueName: \"kubernetes.io/projected/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-kube-api-access-w9627\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.140689 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-catalog-content\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.140733 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-utilities\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.242075 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9627\" (UniqueName: \"kubernetes.io/projected/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-kube-api-access-w9627\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.242133 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-catalog-content\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.242191 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-utilities\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.242620 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-utilities\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.243160 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-catalog-content\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.265394 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9627\" (UniqueName: \"kubernetes.io/projected/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-kube-api-access-w9627\") pod \"redhat-operators-w8wq4\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.277695 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.467379 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7skb"]
Dec 12 00:29:11 crc kubenswrapper[4606]: W1212 00:29:11.474361 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d17c9ef_183f_49d5_96ef_c21b165d4f2a.slice/crio-7d96b8088a3d59a9430deda5cedd21a54bf88db3c389ffe2fcd879b877734e75 WatchSource:0}: Error finding container 7d96b8088a3d59a9430deda5cedd21a54bf88db3c389ffe2fcd879b877734e75: Status 404 returned error can't find the container with id 7d96b8088a3d59a9430deda5cedd21a54bf88db3c389ffe2fcd879b877734e75
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.639399 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8wq4"]
Dec 12 00:29:11 crc kubenswrapper[4606]: W1212 00:29:11.648528 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38061ed5_dc1b_4c0b_9b0d_02412c9ca54d.slice/crio-1a02ef6e49940ba5034184f33549700a528def3799dc97beb2157cd4e353c3eb WatchSource:0}: Error finding container 1a02ef6e49940ba5034184f33549700a528def3799dc97beb2157cd4e353c3eb: Status 404 returned error can't find the container with id 1a02ef6e49940ba5034184f33549700a528def3799dc97beb2157cd4e353c3eb
Dec 12 00:29:11 crc kubenswrapper[4606]: I1212 00:29:11.704950 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303940e6-1922-4197-ad2a-6524c192b1b5" path="/var/lib/kubelet/pods/303940e6-1922-4197-ad2a-6524c192b1b5/volumes"
Dec 12 00:29:12 crc kubenswrapper[4606]: I1212 00:29:12.360988 4606 generic.go:334] "Generic (PLEG): container finished" podID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerID="e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed" exitCode=0
Dec 12 00:29:12 crc kubenswrapper[4606]: I1212 00:29:12.361136 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8wq4" event={"ID":"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d","Type":"ContainerDied","Data":"e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed"}
Dec 12 00:29:12 crc kubenswrapper[4606]: I1212 00:29:12.361377 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8wq4" event={"ID":"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d","Type":"ContainerStarted","Data":"1a02ef6e49940ba5034184f33549700a528def3799dc97beb2157cd4e353c3eb"}
Dec 12 00:29:12 crc kubenswrapper[4606]: I1212 00:29:12.364768 4606 generic.go:334] "Generic (PLEG): container finished" podID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerID="0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e" exitCode=0
Dec 12 00:29:12 crc kubenswrapper[4606]: I1212 00:29:12.364940 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7skb" event={"ID":"2d17c9ef-183f-49d5-96ef-c21b165d4f2a","Type":"ContainerDied","Data":"0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e"}
Dec 12 00:29:12 crc kubenswrapper[4606]: I1212 00:29:12.364970 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7skb" event={"ID":"2d17c9ef-183f-49d5-96ef-c21b165d4f2a","Type":"ContainerStarted","Data":"7d96b8088a3d59a9430deda5cedd21a54bf88db3c389ffe2fcd879b877734e75"}
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.358539 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pw5mx"]
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.360038 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.362441 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.374283 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pw5mx"]
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.377288 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8wq4" event={"ID":"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d","Type":"ContainerStarted","Data":"6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e"}
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.465314 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c55zp\" (UniqueName: \"kubernetes.io/projected/70af385d-13b8-4ff2-8c35-fb9402388dd6-kube-api-access-c55zp\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.465401 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-utilities\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.465426 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-catalog-content\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.560948 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wq42n"]
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.562060 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.564552 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.565993 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-utilities\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.566045 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-catalog-content\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.566105 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c55zp\" (UniqueName: \"kubernetes.io/projected/70af385d-13b8-4ff2-8c35-fb9402388dd6-kube-api-access-c55zp\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.566582 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-utilities\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.566639 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-catalog-content\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.583426 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq42n"]
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.594191 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c55zp\" (UniqueName: \"kubernetes.io/projected/70af385d-13b8-4ff2-8c35-fb9402388dd6-kube-api-access-c55zp\") pod \"certified-operators-pw5mx\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.666893 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-catalog-content\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.666949 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-utilities\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.666996 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw777\" (UniqueName: \"kubernetes.io/projected/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-kube-api-access-hw777\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.681882 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.768744 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-catalog-content\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.768956 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-utilities\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.768996 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw777\" (UniqueName: \"kubernetes.io/projected/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-kube-api-access-hw777\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.769509 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-catalog-content\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.769600 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-utilities\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.793684 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw777\" (UniqueName: \"kubernetes.io/projected/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-kube-api-access-hw777\") pod \"community-operators-wq42n\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:13 crc kubenswrapper[4606]: I1212 00:29:13.884480 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.079018 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pw5mx"]
Dec 12 00:29:14 crc kubenswrapper[4606]: W1212 00:29:14.093474 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70af385d_13b8_4ff2_8c35_fb9402388dd6.slice/crio-2a590979af5e0f826ee30e77dca5349365f4395cdc2845c5a68eba437bfc7dbf WatchSource:0}: Error finding container 2a590979af5e0f826ee30e77dca5349365f4395cdc2845c5a68eba437bfc7dbf: Status 404 returned error can't find the container with id 2a590979af5e0f826ee30e77dca5349365f4395cdc2845c5a68eba437bfc7dbf
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.306581 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq42n"]
Dec 12 00:29:14 crc kubenswrapper[4606]: W1212 00:29:14.310733 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc97a6ce_dd87_4a67_a1ec_99dcce21178c.slice/crio-f7a7a919e005fb2c388486c89d39f5dac935a3f51b25d257f61fd32d5fc09f89 WatchSource:0}: Error finding container f7a7a919e005fb2c388486c89d39f5dac935a3f51b25d257f61fd32d5fc09f89: Status 404 returned error can't find the container with id f7a7a919e005fb2c388486c89d39f5dac935a3f51b25d257f61fd32d5fc09f89
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.383184 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq42n" event={"ID":"bc97a6ce-dd87-4a67-a1ec-99dcce21178c","Type":"ContainerStarted","Data":"f7a7a919e005fb2c388486c89d39f5dac935a3f51b25d257f61fd32d5fc09f89"}
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.384908 4606 generic.go:334] "Generic (PLEG): container finished" podID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerID="6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e" exitCode=0
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.384941 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8wq4" event={"ID":"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d","Type":"ContainerDied","Data":"6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e"}
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.390485 4606 generic.go:334] "Generic (PLEG): container finished" podID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerID="286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40" exitCode=0
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.390562 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pw5mx" event={"ID":"70af385d-13b8-4ff2-8c35-fb9402388dd6","Type":"ContainerDied","Data":"286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40"}
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.390586 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pw5mx" event={"ID":"70af385d-13b8-4ff2-8c35-fb9402388dd6","Type":"ContainerStarted","Data":"2a590979af5e0f826ee30e77dca5349365f4395cdc2845c5a68eba437bfc7dbf"}
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.398409 4606 generic.go:334] "Generic (PLEG): container finished" podID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerID="85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3" exitCode=0
Dec 12 00:29:14 crc kubenswrapper[4606]: I1212 00:29:14.398443 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7skb" event={"ID":"2d17c9ef-183f-49d5-96ef-c21b165d4f2a","Type":"ContainerDied","Data":"85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3"}
Dec 12 00:29:15 crc kubenswrapper[4606]: I1212 00:29:15.404815 4606 generic.go:334] "Generic (PLEG): container finished" podID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerID="7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c" exitCode=0
Dec 12 00:29:15 crc kubenswrapper[4606]: I1212 00:29:15.404907 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq42n" event={"ID":"bc97a6ce-dd87-4a67-a1ec-99dcce21178c","Type":"ContainerDied","Data":"7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c"}
Dec 12 00:29:15 crc kubenswrapper[4606]: I1212 00:29:15.407021 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8wq4" event={"ID":"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d","Type":"ContainerStarted","Data":"8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745"}
Dec 12 00:29:15 crc kubenswrapper[4606]: I1212 00:29:15.410273 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pw5mx" event={"ID":"70af385d-13b8-4ff2-8c35-fb9402388dd6","Type":"ContainerStarted","Data":"062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3"}
Dec 12 00:29:15 crc kubenswrapper[4606]: I1212 00:29:15.413038 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7skb" event={"ID":"2d17c9ef-183f-49d5-96ef-c21b165d4f2a","Type":"ContainerStarted","Data":"95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a"}
Dec 12 00:29:15 crc kubenswrapper[4606]: I1212 00:29:15.441732 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w8wq4" podStartSLOduration=2.818927162 podStartE2EDuration="5.441715553s" podCreationTimestamp="2025-12-12 00:29:10 +0000 UTC" firstStartedPulling="2025-12-12 00:29:12.363867321 +0000 UTC m=+342.909220187" lastFinishedPulling="2025-12-12 00:29:14.986655712 +0000 UTC m=+345.532008578" observedRunningTime="2025-12-12 00:29:15.439046432 +0000 UTC m=+345.984399298" watchObservedRunningTime="2025-12-12 00:29:15.441715553 +0000 UTC m=+345.987068419"
Dec 12 00:29:15 crc kubenswrapper[4606]: I1212 00:29:15.475199 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k7skb" podStartSLOduration=3.049860933 podStartE2EDuration="5.475158224s" podCreationTimestamp="2025-12-12 00:29:10 +0000 UTC" firstStartedPulling="2025-12-12 00:29:12.367933109 +0000 UTC m=+342.913285975" lastFinishedPulling="2025-12-12 00:29:14.7932304 +0000 UTC m=+345.338583266" observedRunningTime="2025-12-12 00:29:15.471469626 +0000 UTC m=+346.016822512" watchObservedRunningTime="2025-12-12 00:29:15.475158224 +0000 UTC m=+346.020511110"
Dec 12 00:29:16 crc kubenswrapper[4606]: I1212 00:29:16.420755 4606 generic.go:334] "Generic (PLEG): container finished" podID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerID="da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358" exitCode=0
Dec 12 00:29:16 crc kubenswrapper[4606]: I1212 00:29:16.420860 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq42n" event={"ID":"bc97a6ce-dd87-4a67-a1ec-99dcce21178c","Type":"ContainerDied","Data":"da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358"}
Dec 12 00:29:16 crc kubenswrapper[4606]: I1212 00:29:16.423491 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pw5mx" event={"ID":"70af385d-13b8-4ff2-8c35-fb9402388dd6","Type":"ContainerDied","Data":"062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3"}
Dec 12 00:29:16 crc kubenswrapper[4606]: I1212 00:29:16.424052 4606 generic.go:334] "Generic (PLEG): container finished" podID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerID="062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3" exitCode=0
Dec 12 00:29:17 crc kubenswrapper[4606]: I1212 00:29:17.433667 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pw5mx" event={"ID":"70af385d-13b8-4ff2-8c35-fb9402388dd6","Type":"ContainerStarted","Data":"9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2"}
Dec 12 00:29:17 crc kubenswrapper[4606]: I1212 00:29:17.435778 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq42n" event={"ID":"bc97a6ce-dd87-4a67-a1ec-99dcce21178c","Type":"ContainerStarted","Data":"8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb"}
Dec 12 00:29:17 crc kubenswrapper[4606]: I1212 00:29:17.455531 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pw5mx" podStartSLOduration=1.987844184 podStartE2EDuration="4.455510724s" podCreationTimestamp="2025-12-12 00:29:13 +0000 UTC" firstStartedPulling="2025-12-12 00:29:14.39566547 +0000 UTC m=+344.941018336" lastFinishedPulling="2025-12-12 00:29:16.86333201 +0000 UTC m=+347.408684876" observedRunningTime="2025-12-12 00:29:17.452783811 +0000 UTC m=+347.998136687" watchObservedRunningTime="2025-12-12 00:29:17.455510724 +0000 UTC m=+348.000863590"
Dec 12 00:29:17 crc kubenswrapper[4606]: I1212 00:29:17.468997 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wq42n" podStartSLOduration=2.889532742 podStartE2EDuration="4.468979432s" podCreationTimestamp="2025-12-12 00:29:13 +0000 UTC" firstStartedPulling="2025-12-12 00:29:15.408798367 +0000 UTC m=+345.954151233" lastFinishedPulling="2025-12-12 00:29:16.988245057 +0000 UTC m=+347.533597923" observedRunningTime="2025-12-12 00:29:17.468290864 +0000 UTC m=+348.013643740" watchObservedRunningTime="2025-12-12 00:29:17.468979432 +0000 UTC m=+348.014332298"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.076285 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.076882 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.113972 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.278132 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.278378 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.315302 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.492892 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w8wq4"
Dec 12 00:29:21 crc kubenswrapper[4606]: I1212 00:29:21.494531 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k7skb"
Dec 12 00:29:23 crc kubenswrapper[4606]: I1212 00:29:23.682810 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:23 crc kubenswrapper[4606]: I1212 00:29:23.683101 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:23 crc kubenswrapper[4606]: I1212 00:29:23.727768 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:23 crc kubenswrapper[4606]: I1212 00:29:23.884831 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:23 crc kubenswrapper[4606]: I1212 00:29:23.884985 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:23 crc kubenswrapper[4606]: I1212 00:29:23.937017 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:24 crc kubenswrapper[4606]: I1212 00:29:24.516554 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pw5mx"
Dec 12 00:29:24 crc kubenswrapper[4606]: I1212 00:29:24.529918 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wq42n"
Dec 12 00:29:32 crc kubenswrapper[4606]: I1212 00:29:32.010721 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 12 00:29:32 crc kubenswrapper[4606]: I1212 00:29:32.011399 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.668314 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wpvbs"]
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.669332 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.690571 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wpvbs"]
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.859992 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc46827e-9fa3-4b39-8d4f-11652662eaee-trusted-ca\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.860086 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cc46827e-9fa3-4b39-8d4f-11652662eaee-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.860442 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltphd\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-kube-api-access-ltphd\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.860591 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cc46827e-9fa3-4b39-8d4f-11652662eaee-registry-certificates\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.860653 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cc46827e-9fa3-4b39-8d4f-11652662eaee-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.860679 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-registry-tls\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.860712 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs"
Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.860901 4606 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-bound-sa-token\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.890636 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.975420 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc46827e-9fa3-4b39-8d4f-11652662eaee-trusted-ca\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.975602 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cc46827e-9fa3-4b39-8d4f-11652662eaee-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.975666 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltphd\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-kube-api-access-ltphd\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.975878 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cc46827e-9fa3-4b39-8d4f-11652662eaee-registry-certificates\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.975921 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-registry-tls\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.975954 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cc46827e-9fa3-4b39-8d4f-11652662eaee-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.975994 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-bound-sa-token\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.978087 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc46827e-9fa3-4b39-8d4f-11652662eaee-trusted-ca\") pod 
\"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.979162 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cc46827e-9fa3-4b39-8d4f-11652662eaee-registry-certificates\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.981672 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cc46827e-9fa3-4b39-8d4f-11652662eaee-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:34 crc kubenswrapper[4606]: I1212 00:29:34.989893 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cc46827e-9fa3-4b39-8d4f-11652662eaee-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:35 crc kubenswrapper[4606]: I1212 00:29:35.009610 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-bound-sa-token\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:35 crc kubenswrapper[4606]: I1212 00:29:35.011804 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-registry-tls\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:35 crc kubenswrapper[4606]: I1212 00:29:35.022062 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltphd\" (UniqueName: \"kubernetes.io/projected/cc46827e-9fa3-4b39-8d4f-11652662eaee-kube-api-access-ltphd\") pod \"image-registry-66df7c8f76-wpvbs\" (UID: \"cc46827e-9fa3-4b39-8d4f-11652662eaee\") " pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:35 crc kubenswrapper[4606]: I1212 00:29:35.286732 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:35 crc kubenswrapper[4606]: I1212 00:29:35.714159 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wpvbs"] Dec 12 00:29:36 crc kubenswrapper[4606]: I1212 00:29:36.539023 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" event={"ID":"cc46827e-9fa3-4b39-8d4f-11652662eaee","Type":"ContainerStarted","Data":"c679231a264e3efc8d37134f49b11b2e0b460136a42fc30b2fd25514307baea4"} Dec 12 00:29:37 crc kubenswrapper[4606]: I1212 00:29:37.546736 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" event={"ID":"cc46827e-9fa3-4b39-8d4f-11652662eaee","Type":"ContainerStarted","Data":"46d2a3809fa3e1da95372c1cb7fac1fb5157789af91ae643e22770d4d1bc2fd2"} Dec 12 00:29:37 crc kubenswrapper[4606]: I1212 00:29:37.546885 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:37 crc kubenswrapper[4606]: I1212 00:29:37.568879 4606 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" podStartSLOduration=3.568854994 podStartE2EDuration="3.568854994s" podCreationTimestamp="2025-12-12 00:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:29:37.564997711 +0000 UTC m=+368.110350577" watchObservedRunningTime="2025-12-12 00:29:37.568854994 +0000 UTC m=+368.114207860" Dec 12 00:29:38 crc kubenswrapper[4606]: I1212 00:29:38.653536 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b974966b8-2pkbn"] Dec 12 00:29:38 crc kubenswrapper[4606]: I1212 00:29:38.654090 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" podUID="d3742fdc-2cac-49fb-945b-11b645a54119" containerName="controller-manager" containerID="cri-o://7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299" gracePeriod=30 Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.048056 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.131557 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nb9w\" (UniqueName: \"kubernetes.io/projected/d3742fdc-2cac-49fb-945b-11b645a54119-kube-api-access-8nb9w\") pod \"d3742fdc-2cac-49fb-945b-11b645a54119\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.131627 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-config\") pod \"d3742fdc-2cac-49fb-945b-11b645a54119\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.131673 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3742fdc-2cac-49fb-945b-11b645a54119-serving-cert\") pod \"d3742fdc-2cac-49fb-945b-11b645a54119\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.131724 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-proxy-ca-bundles\") pod \"d3742fdc-2cac-49fb-945b-11b645a54119\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.131834 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-client-ca\") pod \"d3742fdc-2cac-49fb-945b-11b645a54119\" (UID: \"d3742fdc-2cac-49fb-945b-11b645a54119\") " Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.132619 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-client-ca" (OuterVolumeSpecName: "client-ca") pod "d3742fdc-2cac-49fb-945b-11b645a54119" (UID: "d3742fdc-2cac-49fb-945b-11b645a54119"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.132633 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d3742fdc-2cac-49fb-945b-11b645a54119" (UID: "d3742fdc-2cac-49fb-945b-11b645a54119"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.132731 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-config" (OuterVolumeSpecName: "config") pod "d3742fdc-2cac-49fb-945b-11b645a54119" (UID: "d3742fdc-2cac-49fb-945b-11b645a54119"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.136861 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3742fdc-2cac-49fb-945b-11b645a54119-kube-api-access-8nb9w" (OuterVolumeSpecName: "kube-api-access-8nb9w") pod "d3742fdc-2cac-49fb-945b-11b645a54119" (UID: "d3742fdc-2cac-49fb-945b-11b645a54119"). InnerVolumeSpecName "kube-api-access-8nb9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.140396 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3742fdc-2cac-49fb-945b-11b645a54119-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d3742fdc-2cac-49fb-945b-11b645a54119" (UID: "d3742fdc-2cac-49fb-945b-11b645a54119"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.233071 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-client-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.233106 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nb9w\" (UniqueName: \"kubernetes.io/projected/d3742fdc-2cac-49fb-945b-11b645a54119-kube-api-access-8nb9w\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.233122 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.233132 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3742fdc-2cac-49fb-945b-11b645a54119-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.233143 4606 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3742fdc-2cac-49fb-945b-11b645a54119-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.560096 4606 generic.go:334] "Generic (PLEG): container finished" podID="d3742fdc-2cac-49fb-945b-11b645a54119" containerID="7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299" exitCode=0 Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.560145 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.560163 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" event={"ID":"d3742fdc-2cac-49fb-945b-11b645a54119","Type":"ContainerDied","Data":"7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299"} Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.560690 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b974966b8-2pkbn" event={"ID":"d3742fdc-2cac-49fb-945b-11b645a54119","Type":"ContainerDied","Data":"bde59295a1843a4c352d9165451a46a01ab2e5081bf628199a53c11adcf495cf"} Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.560708 4606 scope.go:117] "RemoveContainer" containerID="7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.578327 4606 scope.go:117] "RemoveContainer" containerID="7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299" Dec 12 00:29:39 crc kubenswrapper[4606]: E1212 00:29:39.578870 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299\": container with ID starting with 7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299 not found: ID does not exist" containerID="7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.578901 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299"} err="failed to get container status \"7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299\": rpc error: code = NotFound desc = could not find container 
\"7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299\": container with ID starting with 7ec242e41e2f545349c2badfaff810881ef486f7b2eff38fbf1add05046c5299 not found: ID does not exist" Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.593593 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b974966b8-2pkbn"] Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.596901 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b974966b8-2pkbn"] Dec 12 00:29:39 crc kubenswrapper[4606]: I1212 00:29:39.708839 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3742fdc-2cac-49fb-945b-11b645a54119" path="/var/lib/kubelet/pods/d3742fdc-2cac-49fb-945b-11b645a54119/volumes" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.534595 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85f878fd5-7wpzw"] Dec 12 00:29:40 crc kubenswrapper[4606]: E1212 00:29:40.534825 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3742fdc-2cac-49fb-945b-11b645a54119" containerName="controller-manager" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.534839 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3742fdc-2cac-49fb-945b-11b645a54119" containerName="controller-manager" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.534974 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3742fdc-2cac-49fb-945b-11b645a54119" containerName="controller-manager" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.535445 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.537790 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.538149 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.538364 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.538484 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.539253 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.542618 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.551811 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.552036 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85f878fd5-7wpzw"] Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.651286 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-client-ca\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " 
pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.651355 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-serving-cert\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.651380 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-config\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.651397 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gtkc\" (UniqueName: \"kubernetes.io/projected/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-kube-api-access-9gtkc\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.651429 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-proxy-ca-bundles\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.752976 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-serving-cert\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.753029 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-config\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.753055 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gtkc\" (UniqueName: \"kubernetes.io/projected/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-kube-api-access-9gtkc\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.753097 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-proxy-ca-bundles\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.753153 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-client-ca\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.754096 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-client-ca\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.754751 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-config\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.755601 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-proxy-ca-bundles\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.765990 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-serving-cert\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc kubenswrapper[4606]: I1212 00:29:40.780745 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gtkc\" (UniqueName: \"kubernetes.io/projected/fd3536e9-7771-4ba8-9ed4-ab3b1821f205-kube-api-access-9gtkc\") pod \"controller-manager-85f878fd5-7wpzw\" (UID: \"fd3536e9-7771-4ba8-9ed4-ab3b1821f205\") " pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:40 crc 
kubenswrapper[4606]: I1212 00:29:40.861332 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:41 crc kubenswrapper[4606]: I1212 00:29:41.295720 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85f878fd5-7wpzw"] Dec 12 00:29:41 crc kubenswrapper[4606]: I1212 00:29:41.582581 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" event={"ID":"fd3536e9-7771-4ba8-9ed4-ab3b1821f205","Type":"ContainerStarted","Data":"71c34bf5a19c7e3c388a8fe933a26450c4933c0b2efdbda78f7bf4949efc6241"} Dec 12 00:29:41 crc kubenswrapper[4606]: I1212 00:29:41.582626 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" event={"ID":"fd3536e9-7771-4ba8-9ed4-ab3b1821f205","Type":"ContainerStarted","Data":"bd313fd4d53bc311fcaf012195779871801dcb2bcb8538886f4f8666b4c45151"} Dec 12 00:29:41 crc kubenswrapper[4606]: I1212 00:29:41.582974 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:41 crc kubenswrapper[4606]: I1212 00:29:41.586587 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" Dec 12 00:29:41 crc kubenswrapper[4606]: I1212 00:29:41.630221 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85f878fd5-7wpzw" podStartSLOduration=3.630203833 podStartE2EDuration="3.630203833s" podCreationTimestamp="2025-12-12 00:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:29:41.601249232 +0000 UTC m=+372.146602118" 
watchObservedRunningTime="2025-12-12 00:29:41.630203833 +0000 UTC m=+372.175556699" Dec 12 00:29:55 crc kubenswrapper[4606]: I1212 00:29:55.293578 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wpvbs" Dec 12 00:29:55 crc kubenswrapper[4606]: I1212 00:29:55.365069 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8nn7"] Dec 12 00:29:58 crc kubenswrapper[4606]: I1212 00:29:58.626837 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8"] Dec 12 00:29:58 crc kubenswrapper[4606]: I1212 00:29:58.627509 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" podUID="c6499dc6-f01c-4d32-a671-8af65df96665" containerName="route-controller-manager" containerID="cri-o://901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39" gracePeriod=30 Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.685962 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.692160 4606 generic.go:334] "Generic (PLEG): container finished" podID="c6499dc6-f01c-4d32-a671-8af65df96665" containerID="901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39" exitCode=0 Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.692211 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.692230 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" event={"ID":"c6499dc6-f01c-4d32-a671-8af65df96665","Type":"ContainerDied","Data":"901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39"} Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.692256 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8" event={"ID":"c6499dc6-f01c-4d32-a671-8af65df96665","Type":"ContainerDied","Data":"522ff1ff058852bf929bdbafd5a1bc57fe5b6b3255fc45a04f091ed660557148"} Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.692289 4606 scope.go:117] "RemoveContainer" containerID="901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.723023 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf"] Dec 12 00:29:59 crc kubenswrapper[4606]: E1212 00:29:59.723435 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6499dc6-f01c-4d32-a671-8af65df96665" containerName="route-controller-manager" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.723450 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6499dc6-f01c-4d32-a671-8af65df96665" containerName="route-controller-manager" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.723588 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6499dc6-f01c-4d32-a671-8af65df96665" containerName="route-controller-manager" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.724036 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.737374 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf"] Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.739990 4606 scope.go:117] "RemoveContainer" containerID="901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39" Dec 12 00:29:59 crc kubenswrapper[4606]: E1212 00:29:59.742629 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39\": container with ID starting with 901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39 not found: ID does not exist" containerID="901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.742667 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39"} err="failed to get container status \"901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39\": rpc error: code = NotFound desc = could not find container \"901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39\": container with ID starting with 901dbbb22e421905433b80ad695c7920448d19b658b2f6930ebe4caf67bf1d39 not found: ID does not exist" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817141 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w59jt\" (UniqueName: \"kubernetes.io/projected/c6499dc6-f01c-4d32-a671-8af65df96665-kube-api-access-w59jt\") pod \"c6499dc6-f01c-4d32-a671-8af65df96665\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817287 4606 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-client-ca\") pod \"c6499dc6-f01c-4d32-a671-8af65df96665\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817344 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-config\") pod \"c6499dc6-f01c-4d32-a671-8af65df96665\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817412 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6499dc6-f01c-4d32-a671-8af65df96665-serving-cert\") pod \"c6499dc6-f01c-4d32-a671-8af65df96665\" (UID: \"c6499dc6-f01c-4d32-a671-8af65df96665\") " Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817649 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-serving-cert\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817838 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-config\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817870 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5vfm7\" (UniqueName: \"kubernetes.io/projected/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-kube-api-access-5vfm7\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.817961 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-client-ca\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.818381 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-client-ca" (OuterVolumeSpecName: "client-ca") pod "c6499dc6-f01c-4d32-a671-8af65df96665" (UID: "c6499dc6-f01c-4d32-a671-8af65df96665"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.818407 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-config" (OuterVolumeSpecName: "config") pod "c6499dc6-f01c-4d32-a671-8af65df96665" (UID: "c6499dc6-f01c-4d32-a671-8af65df96665"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.823591 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6499dc6-f01c-4d32-a671-8af65df96665-kube-api-access-w59jt" (OuterVolumeSpecName: "kube-api-access-w59jt") pod "c6499dc6-f01c-4d32-a671-8af65df96665" (UID: "c6499dc6-f01c-4d32-a671-8af65df96665"). 
InnerVolumeSpecName "kube-api-access-w59jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.824131 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6499dc6-f01c-4d32-a671-8af65df96665-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c6499dc6-f01c-4d32-a671-8af65df96665" (UID: "c6499dc6-f01c-4d32-a671-8af65df96665"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.919879 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-config\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.919979 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vfm7\" (UniqueName: \"kubernetes.io/projected/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-kube-api-access-5vfm7\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.920064 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-client-ca\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.920130 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-serving-cert\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.920197 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w59jt\" (UniqueName: \"kubernetes.io/projected/c6499dc6-f01c-4d32-a671-8af65df96665-kube-api-access-w59jt\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.920227 4606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-client-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.920243 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6499dc6-f01c-4d32-a671-8af65df96665-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.920259 4606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6499dc6-f01c-4d32-a671-8af65df96665-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.921667 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-client-ca\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.922338 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-config\") pod 
\"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.924343 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-serving-cert\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:29:59 crc kubenswrapper[4606]: I1212 00:29:59.935743 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vfm7\" (UniqueName: \"kubernetes.io/projected/e62d6cfd-5d95-4c04-8d1d-ced285cf9c18-kube-api-access-5vfm7\") pod \"route-controller-manager-6c574bd88c-jbwmf\" (UID: \"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18\") " pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.021300 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8"] Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.025798 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585b55f498-rl7j8"] Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.046008 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.170130 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m"] Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.170887 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.172793 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.172928 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.176844 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m"] Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.223683 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-config-volume\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.223743 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njbll\" (UniqueName: \"kubernetes.io/projected/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-kube-api-access-njbll\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.223879 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-secret-volume\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.325022 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-config-volume\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.325104 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njbll\" (UniqueName: \"kubernetes.io/projected/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-kube-api-access-njbll\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.325200 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-secret-volume\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.326453 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-config-volume\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.329693 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-secret-volume\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.340658 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njbll\" (UniqueName: \"kubernetes.io/projected/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-kube-api-access-njbll\") pod \"collect-profiles-29424990-7td8m\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.493417 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.516689 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf"] Dec 12 00:30:00 crc kubenswrapper[4606]: W1212 00:30:00.529659 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode62d6cfd_5d95_4c04_8d1d_ced285cf9c18.slice/crio-c32dbd23ea7675bcfb5e8814418c9215045883800f227262193121fcdc987a6c WatchSource:0}: Error finding container c32dbd23ea7675bcfb5e8814418c9215045883800f227262193121fcdc987a6c: Status 404 returned error can't find the container with id c32dbd23ea7675bcfb5e8814418c9215045883800f227262193121fcdc987a6c Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.698250 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" event={"ID":"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18","Type":"ContainerStarted","Data":"0ac932521eedb9ab965602f15bd790f430a32c9e45f45012311fc6fffa3840a3"} Dec 12 00:30:00 crc 
kubenswrapper[4606]: I1212 00:30:00.698287 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" event={"ID":"e62d6cfd-5d95-4c04-8d1d-ced285cf9c18","Type":"ContainerStarted","Data":"c32dbd23ea7675bcfb5e8814418c9215045883800f227262193121fcdc987a6c"} Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.699116 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.702203 4606 patch_prober.go:28] interesting pod/route-controller-manager-6c574bd88c-jbwmf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.702269 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" podUID="e62d6cfd-5d95-4c04-8d1d-ced285cf9c18" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 00:30:00.719093 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" podStartSLOduration=2.719069656 podStartE2EDuration="2.719069656s" podCreationTimestamp="2025-12-12 00:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:30:00.714193846 +0000 UTC m=+391.259546712" watchObservedRunningTime="2025-12-12 00:30:00.719069656 +0000 UTC m=+391.264422522" Dec 12 00:30:00 crc kubenswrapper[4606]: I1212 
00:30:00.899675 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m"] Dec 12 00:30:01 crc kubenswrapper[4606]: I1212 00:30:01.710146 4606 generic.go:334] "Generic (PLEG): container finished" podID="0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" containerID="0758f1c087a8f7581b046dc559ae82dfe863767335aa9c8506f861dbc5a85de2" exitCode=0 Dec 12 00:30:01 crc kubenswrapper[4606]: I1212 00:30:01.713627 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6499dc6-f01c-4d32-a671-8af65df96665" path="/var/lib/kubelet/pods/c6499dc6-f01c-4d32-a671-8af65df96665/volumes" Dec 12 00:30:01 crc kubenswrapper[4606]: I1212 00:30:01.714429 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" event={"ID":"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2","Type":"ContainerDied","Data":"0758f1c087a8f7581b046dc559ae82dfe863767335aa9c8506f861dbc5a85de2"} Dec 12 00:30:01 crc kubenswrapper[4606]: I1212 00:30:01.714470 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" event={"ID":"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2","Type":"ContainerStarted","Data":"aad03399ce7b49fc785250e7549d0aa567a2b4e6ca63cf6ec0c94e5901b156f9"} Dec 12 00:30:01 crc kubenswrapper[4606]: I1212 00:30:01.718346 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c574bd88c-jbwmf" Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.010536 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.010594 4606 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.010635 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.011083 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a395445cb02f1ca45c19ddbb727d5c765fe5792a9edbe62b6d34f236f52c7139"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.011146 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://a395445cb02f1ca45c19ddbb727d5c765fe5792a9edbe62b6d34f236f52c7139" gracePeriod=600 Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.718428 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="a395445cb02f1ca45c19ddbb727d5c765fe5792a9edbe62b6d34f236f52c7139" exitCode=0 Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.718512 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"a395445cb02f1ca45c19ddbb727d5c765fe5792a9edbe62b6d34f236f52c7139"} Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.719075 
4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"2880ad6f88a790c73e66326c829742ee2381f40cd5326754b1a99b3312add678"} Dec 12 00:30:02 crc kubenswrapper[4606]: I1212 00:30:02.719126 4606 scope.go:117] "RemoveContainer" containerID="5de882904f4a5718bcf44d07da05096363447836f325f35130914c17a30c3a55" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.100106 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.168347 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-secret-volume\") pod \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.168434 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-config-volume\") pod \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.168504 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njbll\" (UniqueName: \"kubernetes.io/projected/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-kube-api-access-njbll\") pod \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\" (UID: \"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2\") " Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.169296 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-config-volume" (OuterVolumeSpecName: "config-volume") 
pod "0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" (UID: "0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.172760 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" (UID: "0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.173940 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-kube-api-access-njbll" (OuterVolumeSpecName: "kube-api-access-njbll") pod "0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" (UID: "0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2"). InnerVolumeSpecName "kube-api-access-njbll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.270054 4606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.270095 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.270111 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njbll\" (UniqueName: \"kubernetes.io/projected/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2-kube-api-access-njbll\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.729231 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" event={"ID":"0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2","Type":"ContainerDied","Data":"aad03399ce7b49fc785250e7549d0aa567a2b4e6ca63cf6ec0c94e5901b156f9"} Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.729647 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aad03399ce7b49fc785250e7549d0aa567a2b4e6ca63cf6ec0c94e5901b156f9" Dec 12 00:30:03 crc kubenswrapper[4606]: I1212 00:30:03.729750 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.411783 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" podUID="40292f84-e865-4368-9e37-e385dfcb5880" containerName="registry" containerID="cri-o://d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31" gracePeriod=30 Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.849719 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.863585 4606 generic.go:334] "Generic (PLEG): container finished" podID="40292f84-e865-4368-9e37-e385dfcb5880" containerID="d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31" exitCode=0 Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.863641 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" event={"ID":"40292f84-e865-4368-9e37-e385dfcb5880","Type":"ContainerDied","Data":"d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31"} Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.863673 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" event={"ID":"40292f84-e865-4368-9e37-e385dfcb5880","Type":"ContainerDied","Data":"c6a578130ba0136088789483edb93741101af32e0fd3f3096c2d4da613ec9421"} Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.863675 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-t8nn7" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.863692 4606 scope.go:117] "RemoveContainer" containerID="d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.885753 4606 scope.go:117] "RemoveContainer" containerID="d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31" Dec 12 00:30:20 crc kubenswrapper[4606]: E1212 00:30:20.886289 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31\": container with ID starting with d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31 not found: ID does not exist" containerID="d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.886334 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31"} err="failed to get container status \"d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31\": rpc error: code = NotFound desc = could not find container \"d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31\": container with ID starting with d34b66ee13398dfdf6ff81baadba90cf00bfbe6b920bd2925d713fc34d18df31 not found: ID does not exist" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.955967 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9p24\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-kube-api-access-f9p24\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.956276 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.956323 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-trusted-ca\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.956375 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-bound-sa-token\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.957145 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.957277 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.957307 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-registry-certificates\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.957467 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40292f84-e865-4368-9e37-e385dfcb5880-installation-pull-secrets\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.957598 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-registry-tls\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.957709 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40292f84-e865-4368-9e37-e385dfcb5880-ca-trust-extracted\") pod \"40292f84-e865-4368-9e37-e385dfcb5880\" (UID: \"40292f84-e865-4368-9e37-e385dfcb5880\") " Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.958283 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.958367 4606 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/40292f84-e865-4368-9e37-e385dfcb5880-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.962813 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40292f84-e865-4368-9e37-e385dfcb5880-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.968403 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.968833 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.968903 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.969058 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-kube-api-access-f9p24" (OuterVolumeSpecName: "kube-api-access-f9p24") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "kube-api-access-f9p24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:30:20 crc kubenswrapper[4606]: I1212 00:30:20.975193 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40292f84-e865-4368-9e37-e385dfcb5880-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "40292f84-e865-4368-9e37-e385dfcb5880" (UID: "40292f84-e865-4368-9e37-e385dfcb5880"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.059105 4606 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40292f84-e865-4368-9e37-e385dfcb5880-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.059141 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9p24\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-kube-api-access-f9p24\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.059153 4606 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.059162 4606 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/40292f84-e865-4368-9e37-e385dfcb5880-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.059184 4606 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40292f84-e865-4368-9e37-e385dfcb5880-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.199308 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8nn7"] Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.205229 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-t8nn7"] Dec 12 00:30:21 crc kubenswrapper[4606]: I1212 00:30:21.716623 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40292f84-e865-4368-9e37-e385dfcb5880" path="/var/lib/kubelet/pods/40292f84-e865-4368-9e37-e385dfcb5880/volumes" Dec 12 00:32:02 crc kubenswrapper[4606]: I1212 00:32:02.010401 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:32:02 crc kubenswrapper[4606]: I1212 00:32:02.011307 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:32:32 crc kubenswrapper[4606]: I1212 00:32:32.010500 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:32:32 crc kubenswrapper[4606]: I1212 00:32:32.011161 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.010538 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.011313 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.011377 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.011957 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2880ad6f88a790c73e66326c829742ee2381f40cd5326754b1a99b3312add678"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.012013 4606 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://2880ad6f88a790c73e66326c829742ee2381f40cd5326754b1a99b3312add678" gracePeriod=600 Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.922506 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="2880ad6f88a790c73e66326c829742ee2381f40cd5326754b1a99b3312add678" exitCode=0 Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.922607 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"2880ad6f88a790c73e66326c829742ee2381f40cd5326754b1a99b3312add678"} Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.922941 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"1873e8515b38b39e992466285ce6933f345b21f7fe695ca304e250f3437cff70"} Dec 12 00:33:02 crc kubenswrapper[4606]: I1212 00:33:02.922980 4606 scope.go:117] "RemoveContainer" containerID="a395445cb02f1ca45c19ddbb727d5c765fe5792a9edbe62b6d34f236f52c7139" Dec 12 00:35:02 crc kubenswrapper[4606]: I1212 00:35:02.010500 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:35:02 crc kubenswrapper[4606]: I1212 00:35:02.012421 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:35:32 crc kubenswrapper[4606]: I1212 00:35:32.010286 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:35:32 crc kubenswrapper[4606]: I1212 00:35:32.011863 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:36:00 crc kubenswrapper[4606]: I1212 00:36:00.419922 4606 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 12 00:36:02 crc kubenswrapper[4606]: I1212 00:36:02.010115 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:36:02 crc kubenswrapper[4606]: I1212 00:36:02.010472 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:36:02 crc kubenswrapper[4606]: I1212 00:36:02.010559 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:36:02 crc kubenswrapper[4606]: I1212 00:36:02.011118 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1873e8515b38b39e992466285ce6933f345b21f7fe695ca304e250f3437cff70"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:36:02 crc kubenswrapper[4606]: I1212 00:36:02.011201 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://1873e8515b38b39e992466285ce6933f345b21f7fe695ca304e250f3437cff70" gracePeriod=600 Dec 12 00:36:03 crc kubenswrapper[4606]: I1212 00:36:03.094147 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="1873e8515b38b39e992466285ce6933f345b21f7fe695ca304e250f3437cff70" exitCode=0 Dec 12 00:36:03 crc kubenswrapper[4606]: I1212 00:36:03.094244 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"1873e8515b38b39e992466285ce6933f345b21f7fe695ca304e250f3437cff70"} Dec 12 00:36:03 crc kubenswrapper[4606]: I1212 00:36:03.094694 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"d38d273cf66284763f6d7e3888567975579d39fc9a7372cc2cae90d2dfe8ce04"} Dec 12 00:36:03 crc kubenswrapper[4606]: I1212 00:36:03.094735 4606 scope.go:117] "RemoveContainer" 
containerID="2880ad6f88a790c73e66326c829742ee2381f40cd5326754b1a99b3312add678" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.969263 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vgqkv"] Dec 12 00:37:27 crc kubenswrapper[4606]: E1212 00:37:27.969905 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40292f84-e865-4368-9e37-e385dfcb5880" containerName="registry" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.969917 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="40292f84-e865-4368-9e37-e385dfcb5880" containerName="registry" Dec 12 00:37:27 crc kubenswrapper[4606]: E1212 00:37:27.969937 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" containerName="collect-profiles" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.969943 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" containerName="collect-profiles" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.970033 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="40292f84-e865-4368-9e37-e385dfcb5880" containerName="registry" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.970045 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" containerName="collect-profiles" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.970413 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.972277 4606 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-tsrsr" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.972429 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.973814 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.984788 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vgqkv"] Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.997524 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-8wvkl"] Dec 12 00:37:27 crc kubenswrapper[4606]: I1212 00:37:27.998263 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-8wvkl" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.001909 4606 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-cfw68" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.018819 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-8wvkl"] Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.022349 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2vtxw"] Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.022964 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.029978 4606 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qkvsc" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.039163 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2vtxw"] Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.046503 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/95dcdca5-05de-43d2-a86c-757b112d1cd5-kube-api-access-4dn62\") pod \"cert-manager-cainjector-7f985d654d-vgqkv\" (UID: \"95dcdca5-05de-43d2-a86c-757b112d1cd5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.046597 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdshq\" (UniqueName: \"kubernetes.io/projected/4def20cb-2590-41e7-9c98-6fd10a84d049-kube-api-access-qdshq\") pod \"cert-manager-5b446d88c5-8wvkl\" (UID: \"4def20cb-2590-41e7-9c98-6fd10a84d049\") " pod="cert-manager/cert-manager-5b446d88c5-8wvkl" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.147320 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdshq\" (UniqueName: \"kubernetes.io/projected/4def20cb-2590-41e7-9c98-6fd10a84d049-kube-api-access-qdshq\") pod \"cert-manager-5b446d88c5-8wvkl\" (UID: \"4def20cb-2590-41e7-9c98-6fd10a84d049\") " pod="cert-manager/cert-manager-5b446d88c5-8wvkl" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.147363 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8knw\" (UniqueName: 
\"kubernetes.io/projected/886ac20b-ef3b-459a-8539-6a7040bcd6fb-kube-api-access-t8knw\") pod \"cert-manager-webhook-5655c58dd6-2vtxw\" (UID: \"886ac20b-ef3b-459a-8539-6a7040bcd6fb\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.147413 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/95dcdca5-05de-43d2-a86c-757b112d1cd5-kube-api-access-4dn62\") pod \"cert-manager-cainjector-7f985d654d-vgqkv\" (UID: \"95dcdca5-05de-43d2-a86c-757b112d1cd5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.164237 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdshq\" (UniqueName: \"kubernetes.io/projected/4def20cb-2590-41e7-9c98-6fd10a84d049-kube-api-access-qdshq\") pod \"cert-manager-5b446d88c5-8wvkl\" (UID: \"4def20cb-2590-41e7-9c98-6fd10a84d049\") " pod="cert-manager/cert-manager-5b446d88c5-8wvkl" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.170522 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/95dcdca5-05de-43d2-a86c-757b112d1cd5-kube-api-access-4dn62\") pod \"cert-manager-cainjector-7f985d654d-vgqkv\" (UID: \"95dcdca5-05de-43d2-a86c-757b112d1cd5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.247992 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8knw\" (UniqueName: \"kubernetes.io/projected/886ac20b-ef3b-459a-8539-6a7040bcd6fb-kube-api-access-t8knw\") pod \"cert-manager-webhook-5655c58dd6-2vtxw\" (UID: \"886ac20b-ef3b-459a-8539-6a7040bcd6fb\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.266848 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8knw\" (UniqueName: \"kubernetes.io/projected/886ac20b-ef3b-459a-8539-6a7040bcd6fb-kube-api-access-t8knw\") pod \"cert-manager-webhook-5655c58dd6-2vtxw\" (UID: \"886ac20b-ef3b-459a-8539-6a7040bcd6fb\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.284512 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.318450 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-8wvkl" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.336132 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.569029 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vgqkv"] Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.583073 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.622830 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" event={"ID":"95dcdca5-05de-43d2-a86c-757b112d1cd5","Type":"ContainerStarted","Data":"31c73a0cd310a0f126b2333fad8f4568f3f96043ff41f67a79352b68e5694286"} Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.654942 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2vtxw"] Dec 12 00:37:28 crc kubenswrapper[4606]: W1212 00:37:28.661207 4606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886ac20b_ef3b_459a_8539_6a7040bcd6fb.slice/crio-a919e0d08b0c7d89c971096aa8aeb3baf2eb0e66512da45b9e8f876faca7ba22 WatchSource:0}: Error finding container a919e0d08b0c7d89c971096aa8aeb3baf2eb0e66512da45b9e8f876faca7ba22: Status 404 returned error can't find the container with id a919e0d08b0c7d89c971096aa8aeb3baf2eb0e66512da45b9e8f876faca7ba22 Dec 12 00:37:28 crc kubenswrapper[4606]: I1212 00:37:28.821645 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-8wvkl"] Dec 12 00:37:28 crc kubenswrapper[4606]: W1212 00:37:28.827392 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4def20cb_2590_41e7_9c98_6fd10a84d049.slice/crio-e3ca03ba275cbe5a07f21195c4493ddb4d518692bd90b6f8f08f7c71e0dcc2c1 WatchSource:0}: Error finding container e3ca03ba275cbe5a07f21195c4493ddb4d518692bd90b6f8f08f7c71e0dcc2c1: Status 404 returned error can't find the container with id e3ca03ba275cbe5a07f21195c4493ddb4d518692bd90b6f8f08f7c71e0dcc2c1 Dec 12 00:37:29 crc kubenswrapper[4606]: I1212 00:37:29.631101 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-8wvkl" event={"ID":"4def20cb-2590-41e7-9c98-6fd10a84d049","Type":"ContainerStarted","Data":"e3ca03ba275cbe5a07f21195c4493ddb4d518692bd90b6f8f08f7c71e0dcc2c1"} Dec 12 00:37:29 crc kubenswrapper[4606]: I1212 00:37:29.632151 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" event={"ID":"886ac20b-ef3b-459a-8539-6a7040bcd6fb","Type":"ContainerStarted","Data":"a919e0d08b0c7d89c971096aa8aeb3baf2eb0e66512da45b9e8f876faca7ba22"} Dec 12 00:37:31 crc kubenswrapper[4606]: I1212 00:37:31.644237 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" 
event={"ID":"95dcdca5-05de-43d2-a86c-757b112d1cd5","Type":"ContainerStarted","Data":"2116ee753ba221402393905939401c63d3cf499609345b249c02e08fb3a9c37d"} Dec 12 00:37:32 crc kubenswrapper[4606]: I1212 00:37:32.652819 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-8wvkl" event={"ID":"4def20cb-2590-41e7-9c98-6fd10a84d049","Type":"ContainerStarted","Data":"f7f6fddabec5445ac1a75258ae5c79c43972bf060df2183770433c42d7b95e61"} Dec 12 00:37:32 crc kubenswrapper[4606]: I1212 00:37:32.654836 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" event={"ID":"886ac20b-ef3b-459a-8539-6a7040bcd6fb","Type":"ContainerStarted","Data":"79f76347236900f547889d4063b043e2847e883cdfdabb9f174c0c4992470d1e"} Dec 12 00:37:32 crc kubenswrapper[4606]: I1212 00:37:32.678161 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-vgqkv" podStartSLOduration=3.569622809 podStartE2EDuration="5.678142542s" podCreationTimestamp="2025-12-12 00:37:27 +0000 UTC" firstStartedPulling="2025-12-12 00:37:28.582830566 +0000 UTC m=+839.128183432" lastFinishedPulling="2025-12-12 00:37:30.691350299 +0000 UTC m=+841.236703165" observedRunningTime="2025-12-12 00:37:31.662163756 +0000 UTC m=+842.207516632" watchObservedRunningTime="2025-12-12 00:37:32.678142542 +0000 UTC m=+843.223495418" Dec 12 00:37:32 crc kubenswrapper[4606]: I1212 00:37:32.680628 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-8wvkl" podStartSLOduration=2.628120206 podStartE2EDuration="5.680610399s" podCreationTimestamp="2025-12-12 00:37:27 +0000 UTC" firstStartedPulling="2025-12-12 00:37:28.829458484 +0000 UTC m=+839.374811360" lastFinishedPulling="2025-12-12 00:37:31.881948687 +0000 UTC m=+842.427301553" observedRunningTime="2025-12-12 00:37:32.676219741 +0000 UTC m=+843.221572627" 
watchObservedRunningTime="2025-12-12 00:37:32.680610399 +0000 UTC m=+843.225963285" Dec 12 00:37:33 crc kubenswrapper[4606]: I1212 00:37:33.336384 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.790352 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" podStartSLOduration=7.577470959 podStartE2EDuration="10.790326618s" podCreationTimestamp="2025-12-12 00:37:27 +0000 UTC" firstStartedPulling="2025-12-12 00:37:28.663010364 +0000 UTC m=+839.208363230" lastFinishedPulling="2025-12-12 00:37:31.875866023 +0000 UTC m=+842.421218889" observedRunningTime="2025-12-12 00:37:32.716845019 +0000 UTC m=+843.262197915" watchObservedRunningTime="2025-12-12 00:37:37.790326618 +0000 UTC m=+848.335679514" Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.796735 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hpw5w"] Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.797428 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-controller" containerID="cri-o://0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" gracePeriod=30 Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.797550 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="northd" containerID="cri-o://366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" gracePeriod=30 Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.797643 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" 
podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="sbdb" containerID="cri-o://8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" gracePeriod=30 Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.797631 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-acl-logging" containerID="cri-o://6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" gracePeriod=30 Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.797698 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="nbdb" containerID="cri-o://dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" gracePeriod=30 Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.797597 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" gracePeriod=30 Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.799729 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-node" containerID="cri-o://fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" gracePeriod=30 Dec 12 00:37:37 crc kubenswrapper[4606]: I1212 00:37:37.842408 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" 
containerID="cri-o://6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" gracePeriod=30 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.117011 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/3.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.119535 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovn-acl-logging/0.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.120743 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovn-controller/0.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.121194 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.171918 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nklsf"] Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.172350 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-ovn-metrics" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.172434 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-ovn-metrics" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.172503 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.172563 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 
crc kubenswrapper[4606]: E1212 00:37:38.172618 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kubecfg-setup" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.172699 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kubecfg-setup" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.172769 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="nbdb" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.172829 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="nbdb" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.172895 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="sbdb" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.172957 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="sbdb" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173023 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-acl-logging" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.173083 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-acl-logging" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173154 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.173238 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173306 4606 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.173361 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173421 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.173480 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173553 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-node" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.173615 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-node" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173680 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="northd" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.173740 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="northd" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173801 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.173861 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: E1212 00:37:38.173938 4606 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174005 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174166 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174272 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174355 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174457 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="sbdb" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174511 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174558 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="nbdb" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174606 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="northd" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174661 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 
00:37:38.174710 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovn-acl-logging" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174756 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-node" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.174803 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="kube-rbac-proxy-ovn-metrics" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.175007 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerName="ovnkube-controller" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.176579 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183352 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-node-log\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183399 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-etc-openvswitch\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183433 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-var-lib-openvswitch\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: 
\"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183449 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-log-socket\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183456 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-node-log" (OuterVolumeSpecName: "node-log") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183464 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-systemd-units\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183485 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183506 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-log-socket" (OuterVolumeSpecName: "log-socket") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183520 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-config\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183525 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183526 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183556 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-ovn-kubernetes\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183581 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183624 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-netns\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183659 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183681 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-ovn\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183710 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-slash\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183750 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183750 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-kubelet\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183783 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-openvswitch\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183783 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183802 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-bin\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183824 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-slash" (OuterVolumeSpecName: "host-slash") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183831 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183839 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovn-node-metrics-cert\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183887 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-env-overrides\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183918 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-var-lib-cni-networks-ovn-kubernetes\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183940 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-netd\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc 
kubenswrapper[4606]: I1212 00:37:38.183888 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.183962 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-systemd\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184030 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184048 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbcl8\" (UniqueName: \"kubernetes.io/projected/da25b0ba-5398-4185-a4c1-aeba44ae5633-kube-api-access-hbcl8\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184073 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184088 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-script-lib\") pod \"da25b0ba-5398-4185-a4c1-aeba44ae5633\" (UID: \"da25b0ba-5398-4185-a4c1-aeba44ae5633\") " Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184103 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184221 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184363 4606 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-node-log\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184379 4606 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184393 4606 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184404 4606 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-log-socket\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184414 4606 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184425 4606 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184435 4606 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 
00:37:38.184446 4606 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184457 4606 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184466 4606 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-slash\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184476 4606 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184486 4606 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184496 4606 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184505 4606 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184515 4606 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184526 4606 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.184550 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.189166 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da25b0ba-5398-4185-a4c1-aeba44ae5633-kube-api-access-hbcl8" (OuterVolumeSpecName: "kube-api-access-hbcl8") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "kube-api-access-hbcl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.190237 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.198858 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "da25b0ba-5398-4185-a4c1-aeba44ae5633" (UID: "da25b0ba-5398-4185-a4c1-aeba44ae5633"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285751 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-node-log\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285820 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovnkube-config\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285852 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285871 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-cni-netd\") pod \"ovnkube-node-nklsf\" (UID: 
\"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285908 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-ovn\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285928 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-env-overrides\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285946 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285962 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gpnk\" (UniqueName: \"kubernetes.io/projected/c3161e84-a5cb-41b9-adad-c9b4ab79b746-kube-api-access-7gpnk\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285979 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-systemd\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.285993 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-etc-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286012 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-kubelet\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286026 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-var-lib-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286039 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-cni-bin\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286055 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovnkube-script-lib\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286071 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovn-node-metrics-cert\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286086 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-systemd-units\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286111 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-log-socket\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286128 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-run-ovn-kubernetes\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286142 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-run-netns\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286160 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-slash\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286223 4606 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286234 4606 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da25b0ba-5398-4185-a4c1-aeba44ae5633-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286243 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbcl8\" (UniqueName: \"kubernetes.io/projected/da25b0ba-5398-4185-a4c1-aeba44ae5633-kube-api-access-hbcl8\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.286251 4606 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da25b0ba-5398-4185-a4c1-aeba44ae5633-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.338718 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-2vtxw" 
Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387124 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387164 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-cni-netd\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387225 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-ovn\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387245 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-env-overrides\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387264 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: 
I1212 00:37:38.387278 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gpnk\" (UniqueName: \"kubernetes.io/projected/c3161e84-a5cb-41b9-adad-c9b4ab79b746-kube-api-access-7gpnk\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387297 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-systemd\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387310 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-etc-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387338 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-kubelet\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387351 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-cni-bin\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387367 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-var-lib-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387383 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovnkube-script-lib\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387398 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovn-node-metrics-cert\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387412 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-systemd-units\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387437 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-log-socket\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387454 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-run-ovn-kubernetes\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387469 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-run-netns\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387488 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-slash\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387503 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-node-log\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.387522 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovnkube-config\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388140 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovnkube-config\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388216 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388250 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-cni-netd\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388280 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-var-lib-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388292 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-cni-bin\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388318 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-run-netns\") pod \"ovnkube-node-nklsf\" (UID: 
\"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388330 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-log-socket\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388340 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-slash\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388360 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-node-log\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388386 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-systemd-units\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388624 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-ovn\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388985 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-env-overrides\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.389013 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-run-systemd\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.389019 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.389042 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-kubelet\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.388304 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-host-run-ovn-kubernetes\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.389047 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3161e84-a5cb-41b9-adad-c9b4ab79b746-etc-openvswitch\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.389293 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovnkube-script-lib\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.392101 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3161e84-a5cb-41b9-adad-c9b4ab79b746-ovn-node-metrics-cert\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.412117 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gpnk\" (UniqueName: \"kubernetes.io/projected/c3161e84-a5cb-41b9-adad-c9b4ab79b746-kube-api-access-7gpnk\") pod \"ovnkube-node-nklsf\" (UID: \"c3161e84-a5cb-41b9-adad-c9b4ab79b746\") " pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.491495 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.709928 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovnkube-controller/3.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.713154 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovn-acl-logging/0.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.713914 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hpw5w_da25b0ba-5398-4185-a4c1-aeba44ae5633/ovn-controller/0.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714521 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" exitCode=0 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714550 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" exitCode=0 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714564 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" exitCode=0 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714573 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" exitCode=0 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714584 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" 
containerID="653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" exitCode=0 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714596 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" exitCode=0 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714604 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" exitCode=143 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714613 4606 generic.go:334] "Generic (PLEG): container finished" podID="da25b0ba-5398-4185-a4c1-aeba44ae5633" containerID="0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" exitCode=143 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714612 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714684 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714724 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714770 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" 
event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714787 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714798 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714993 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715027 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715045 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715063 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.714745 4606 scope.go:117] "RemoveContainer" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715077 4606 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715188 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715206 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715213 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715220 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715227 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715246 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715269 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} Dec 12 
00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715278 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715285 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715291 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715298 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715305 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715311 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715319 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715354 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} Dec 12 
00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715368 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715379 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715391 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715400 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715406 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715413 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715420 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715427 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715434 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715442 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715449 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715456 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715465 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hpw5w" event={"ID":"da25b0ba-5398-4185-a4c1-aeba44ae5633","Type":"ContainerDied","Data":"5c2a63eacc2ccc10dc4ae350eb8ea92415a431f080396e6b22a1b790b6667255"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715475 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715483 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715490 4606 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715497 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715504 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715510 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715517 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715524 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715530 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.715538 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.720914 4606 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/2.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.722389 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/1.log" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.722537 4606 generic.go:334] "Generic (PLEG): container finished" podID="b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0" containerID="834943ef77c01eccd37c1ec6b4bf249f411ca5b5a275a69b5e5939d8f08242e8" exitCode=2 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.722597 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerDied","Data":"834943ef77c01eccd37c1ec6b4bf249f411ca5b5a275a69b5e5939d8f08242e8"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.722858 4606 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.723547 4606 scope.go:117] "RemoveContainer" containerID="834943ef77c01eccd37c1ec6b4bf249f411ca5b5a275a69b5e5939d8f08242e8" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.727793 4606 generic.go:334] "Generic (PLEG): container finished" podID="c3161e84-a5cb-41b9-adad-c9b4ab79b746" containerID="f2bcd2c70199c221c51140cf1f89e809537dff3342b8acb8c7d6ac211c1d586d" exitCode=0 Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.727868 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerDied","Data":"f2bcd2c70199c221c51140cf1f89e809537dff3342b8acb8c7d6ac211c1d586d"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.727972 4606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"88ba0d3662df496e2d6ccadf241478b76e191ad7a063bb99842c1df5b9180015"} Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.749382 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.812650 4606 scope.go:117] "RemoveContainer" containerID="8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.845237 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hpw5w"] Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.850925 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hpw5w"] Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.851406 4606 scope.go:117] "RemoveContainer" containerID="dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.867840 4606 scope.go:117] "RemoveContainer" containerID="366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.882350 4606 scope.go:117] "RemoveContainer" containerID="653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.893740 4606 scope.go:117] "RemoveContainer" containerID="fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.947448 4606 scope.go:117] "RemoveContainer" containerID="6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" Dec 12 00:37:38 crc kubenswrapper[4606]: I1212 00:37:38.968518 4606 scope.go:117] "RemoveContainer" 
containerID="0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.001300 4606 scope.go:117] "RemoveContainer" containerID="f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.025213 4606 scope.go:117] "RemoveContainer" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.025559 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": container with ID starting with 6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201 not found: ID does not exist" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.025587 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} err="failed to get container status \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": rpc error: code = NotFound desc = could not find container \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": container with ID starting with 6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.025611 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.025875 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": container with ID starting with 
04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b not found: ID does not exist" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.025911 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} err="failed to get container status \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": rpc error: code = NotFound desc = could not find container \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": container with ID starting with 04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.025937 4606 scope.go:117] "RemoveContainer" containerID="8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.026152 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": container with ID starting with 8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86 not found: ID does not exist" containerID="8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.026201 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} err="failed to get container status \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": rpc error: code = NotFound desc = could not find container \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": container with ID starting with 8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86 not found: ID does not 
exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.026219 4606 scope.go:117] "RemoveContainer" containerID="dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.026448 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": container with ID starting with dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071 not found: ID does not exist" containerID="dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.026479 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} err="failed to get container status \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": rpc error: code = NotFound desc = could not find container \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": container with ID starting with dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.026501 4606 scope.go:117] "RemoveContainer" containerID="366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.026721 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": container with ID starting with 366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f not found: ID does not exist" containerID="366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.026748 4606 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} err="failed to get container status \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": rpc error: code = NotFound desc = could not find container \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": container with ID starting with 366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.026765 4606 scope.go:117] "RemoveContainer" containerID="653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.027013 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": container with ID starting with 653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537 not found: ID does not exist" containerID="653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.027039 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} err="failed to get container status \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": rpc error: code = NotFound desc = could not find container \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": container with ID starting with 653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.027055 4606 scope.go:117] "RemoveContainer" containerID="fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.027335 4606 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": container with ID starting with fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670 not found: ID does not exist" containerID="fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.027361 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} err="failed to get container status \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": rpc error: code = NotFound desc = could not find container \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": container with ID starting with fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.027378 4606 scope.go:117] "RemoveContainer" containerID="6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.027624 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": container with ID starting with 6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00 not found: ID does not exist" containerID="6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.027659 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} err="failed to get container status \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": rpc error: code = NotFound desc = could 
not find container \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": container with ID starting with 6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.027677 4606 scope.go:117] "RemoveContainer" containerID="0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.027877 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": container with ID starting with 0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d not found: ID does not exist" containerID="0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.027902 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} err="failed to get container status \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": rpc error: code = NotFound desc = could not find container \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": container with ID starting with 0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.028002 4606 scope.go:117] "RemoveContainer" containerID="f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2" Dec 12 00:37:39 crc kubenswrapper[4606]: E1212 00:37:39.028426 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": container with ID starting with f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2 not found: 
ID does not exist" containerID="f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.028445 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} err="failed to get container status \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": rpc error: code = NotFound desc = could not find container \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": container with ID starting with f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.028460 4606 scope.go:117] "RemoveContainer" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.028713 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} err="failed to get container status \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": rpc error: code = NotFound desc = could not find container \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": container with ID starting with 6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.028735 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.028978 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} err="failed to get container status \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": rpc error: code = 
NotFound desc = could not find container \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": container with ID starting with 04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.029000 4606 scope.go:117] "RemoveContainer" containerID="8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.029558 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} err="failed to get container status \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": rpc error: code = NotFound desc = could not find container \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": container with ID starting with 8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.029578 4606 scope.go:117] "RemoveContainer" containerID="dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.029825 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} err="failed to get container status \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": rpc error: code = NotFound desc = could not find container \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": container with ID starting with dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.029850 4606 scope.go:117] "RemoveContainer" containerID="366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" Dec 12 00:37:39 crc 
kubenswrapper[4606]: I1212 00:37:39.032217 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} err="failed to get container status \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": rpc error: code = NotFound desc = could not find container \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": container with ID starting with 366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.032265 4606 scope.go:117] "RemoveContainer" containerID="653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034064 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} err="failed to get container status \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": rpc error: code = NotFound desc = could not find container \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": container with ID starting with 653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034087 4606 scope.go:117] "RemoveContainer" containerID="fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034434 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} err="failed to get container status \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": rpc error: code = NotFound desc = could not find container \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": container 
with ID starting with fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034454 4606 scope.go:117] "RemoveContainer" containerID="6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034648 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} err="failed to get container status \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": rpc error: code = NotFound desc = could not find container \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": container with ID starting with 6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034663 4606 scope.go:117] "RemoveContainer" containerID="0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034860 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} err="failed to get container status \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": rpc error: code = NotFound desc = could not find container \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": container with ID starting with 0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.034883 4606 scope.go:117] "RemoveContainer" containerID="f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035136 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} err="failed to get container status \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": rpc error: code = NotFound desc = could not find container \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": container with ID starting with f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035157 4606 scope.go:117] "RemoveContainer" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035422 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} err="failed to get container status \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": rpc error: code = NotFound desc = could not find container \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": container with ID starting with 6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035442 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035651 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} err="failed to get container status \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": rpc error: code = NotFound desc = could not find container \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": container with ID starting with 04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b not found: ID does not 
exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035669 4606 scope.go:117] "RemoveContainer" containerID="8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035886 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} err="failed to get container status \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": rpc error: code = NotFound desc = could not find container \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": container with ID starting with 8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.035907 4606 scope.go:117] "RemoveContainer" containerID="dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036092 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} err="failed to get container status \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": rpc error: code = NotFound desc = could not find container \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": container with ID starting with dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036110 4606 scope.go:117] "RemoveContainer" containerID="366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036338 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} err="failed to get container status 
\"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": rpc error: code = NotFound desc = could not find container \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": container with ID starting with 366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036358 4606 scope.go:117] "RemoveContainer" containerID="653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036553 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} err="failed to get container status \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": rpc error: code = NotFound desc = could not find container \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": container with ID starting with 653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036573 4606 scope.go:117] "RemoveContainer" containerID="fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036755 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} err="failed to get container status \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": rpc error: code = NotFound desc = could not find container \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": container with ID starting with fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036776 4606 scope.go:117] "RemoveContainer" 
containerID="6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036970 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} err="failed to get container status \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": rpc error: code = NotFound desc = could not find container \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": container with ID starting with 6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.036990 4606 scope.go:117] "RemoveContainer" containerID="0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037183 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} err="failed to get container status \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": rpc error: code = NotFound desc = could not find container \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": container with ID starting with 0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037203 4606 scope.go:117] "RemoveContainer" containerID="f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037404 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} err="failed to get container status \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": rpc error: code = NotFound desc = could 
not find container \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": container with ID starting with f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037424 4606 scope.go:117] "RemoveContainer" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037609 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} err="failed to get container status \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": rpc error: code = NotFound desc = could not find container \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": container with ID starting with 6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037629 4606 scope.go:117] "RemoveContainer" containerID="04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037816 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b"} err="failed to get container status \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": rpc error: code = NotFound desc = could not find container \"04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b\": container with ID starting with 04af88555385f41f61f31bcc6dcfd9ea677fd98ba9ef0b1a25003fff11e8be1b not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.037837 4606 scope.go:117] "RemoveContainer" containerID="8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 
00:37:39.038103 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86"} err="failed to get container status \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": rpc error: code = NotFound desc = could not find container \"8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86\": container with ID starting with 8e3c08f155fa3cef9692edf10c4df9719f3647b5625a9ac68a7c158afb80dd86 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.038123 4606 scope.go:117] "RemoveContainer" containerID="dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.038383 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071"} err="failed to get container status \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": rpc error: code = NotFound desc = could not find container \"dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071\": container with ID starting with dbb742680d55b47d51cd1964e187b6816e50a88befbce58d3a8746af74cf7071 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.038407 4606 scope.go:117] "RemoveContainer" containerID="366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.038601 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f"} err="failed to get container status \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": rpc error: code = NotFound desc = could not find container \"366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f\": container with ID starting with 
366ddc63e1156e702b24e0c3d3c02785c10324669fa851e46851f9a2b8a46a5f not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.038622 4606 scope.go:117] "RemoveContainer" containerID="653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.038814 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537"} err="failed to get container status \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": rpc error: code = NotFound desc = could not find container \"653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537\": container with ID starting with 653bdfa6f15b75149c89a4aff0706e132368f8bc75520cde9f1e1c55c8866537 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.038858 4606 scope.go:117] "RemoveContainer" containerID="fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039100 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670"} err="failed to get container status \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": rpc error: code = NotFound desc = could not find container \"fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670\": container with ID starting with fa827acce161127216bcfcf3553efeb627cbe52e4d23a8cb011e025f67dde670 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039127 4606 scope.go:117] "RemoveContainer" containerID="6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039392 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00"} err="failed to get container status \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": rpc error: code = NotFound desc = could not find container \"6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00\": container with ID starting with 6183a4685fda61493c792e9ae693cd7f76e4aad9753f622b48347da3fd2b3b00 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039413 4606 scope.go:117] "RemoveContainer" containerID="0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039600 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d"} err="failed to get container status \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": rpc error: code = NotFound desc = could not find container \"0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d\": container with ID starting with 0867a89de2edf12036267a937b7e3b1a801e9710ff0a1250c20e020c80ac4d5d not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039621 4606 scope.go:117] "RemoveContainer" containerID="f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039811 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2"} err="failed to get container status \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": rpc error: code = NotFound desc = could not find container \"f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2\": container with ID starting with f94a9f641815d76464c17c6145b9ac4803eedab3bea29a52d65a793d448cebe2 not found: ID does not 
exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.039832 4606 scope.go:117] "RemoveContainer" containerID="6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.040016 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201"} err="failed to get container status \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": rpc error: code = NotFound desc = could not find container \"6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201\": container with ID starting with 6bfebed57887da9d277d1b3dc30ba0ef65a624b36ede841840c041551ed0b201 not found: ID does not exist" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.710101 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da25b0ba-5398-4185-a4c1-aeba44ae5633" path="/var/lib/kubelet/pods/da25b0ba-5398-4185-a4c1-aeba44ae5633/volumes" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.743507 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/2.log" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.744370 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/1.log" Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.744469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzcfk" event={"ID":"b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0","Type":"ContainerStarted","Data":"a7913c49635cfe2008f40abdcbf8644021f335bca2a6ed9881afcfb687727206"} Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.753941 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" 
event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"6680291e6affcfbb61d90774e405ff564107d86d401802a8a0aa070cc48935a4"} Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.753991 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"beab3c08dae34fe7bbb858e9ad00a4f235e45b8b3b5c9102f8feca94876f8ebb"} Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.754003 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"619f64ae7d263c49c56165dcc8260070f54ab249c8f4a8d692f1a6018a296e91"} Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.754015 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"9a3157e3a36ba84eb0153647f18d989a5a412969dba4481bf51d4fd5f21299cc"} Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.754031 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"86076dc5e03422b2f128210f9aec25f80f10ee930bc7f4440e8f8df7e45d8991"} Dec 12 00:37:39 crc kubenswrapper[4606]: I1212 00:37:39.754041 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"d816b2b8ae4d5b3c0a7a33bbb73a6c15f7dd3d7724d6a4d0fdd5bee1aeed61d2"} Dec 12 00:37:41 crc kubenswrapper[4606]: I1212 00:37:41.775346 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" 
event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"90e51df3cd17c02aa4662c2875706b5fbb14151e6982564d224ed4d14c1d0d61"} Dec 12 00:37:44 crc kubenswrapper[4606]: I1212 00:37:44.801625 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" event={"ID":"c3161e84-a5cb-41b9-adad-c9b4ab79b746","Type":"ContainerStarted","Data":"50b48df381750693ca31561926789f87ba7ae236aa7a8889e047c12a50536096"} Dec 12 00:37:44 crc kubenswrapper[4606]: I1212 00:37:44.802239 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:44 crc kubenswrapper[4606]: I1212 00:37:44.802254 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:44 crc kubenswrapper[4606]: I1212 00:37:44.832615 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:44 crc kubenswrapper[4606]: I1212 00:37:44.842602 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" podStartSLOduration=6.842586255 podStartE2EDuration="6.842586255s" podCreationTimestamp="2025-12-12 00:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:37:44.841493665 +0000 UTC m=+855.386846551" watchObservedRunningTime="2025-12-12 00:37:44.842586255 +0000 UTC m=+855.387939131" Dec 12 00:37:45 crc kubenswrapper[4606]: I1212 00:37:45.808406 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:37:45 crc kubenswrapper[4606]: I1212 00:37:45.881107 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:38:02 crc 
kubenswrapper[4606]: I1212 00:38:02.010922 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:38:02 crc kubenswrapper[4606]: I1212 00:38:02.011660 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:38:08 crc kubenswrapper[4606]: I1212 00:38:08.530092 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nklsf" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.587435 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5rvsk"] Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.589682 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.607446 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rvsk"] Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.768813 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-utilities\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.768874 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-catalog-content\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.769014 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8srt\" (UniqueName: \"kubernetes.io/projected/c7d278b3-153c-4a0f-b333-a568a2be0b9a-kube-api-access-h8srt\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.870271 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8srt\" (UniqueName: \"kubernetes.io/projected/c7d278b3-153c-4a0f-b333-a568a2be0b9a-kube-api-access-h8srt\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.870424 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-utilities\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.870458 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-catalog-content\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.871100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-utilities\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.871781 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-catalog-content\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.892954 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8srt\" (UniqueName: \"kubernetes.io/projected/c7d278b3-153c-4a0f-b333-a568a2be0b9a-kube-api-access-h8srt\") pod \"redhat-operators-5rvsk\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:16 crc kubenswrapper[4606]: I1212 00:38:16.935274 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:17 crc kubenswrapper[4606]: I1212 00:38:17.177964 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rvsk"] Dec 12 00:38:18 crc kubenswrapper[4606]: I1212 00:38:18.015457 4606 generic.go:334] "Generic (PLEG): container finished" podID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerID="b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d" exitCode=0 Dec 12 00:38:18 crc kubenswrapper[4606]: I1212 00:38:18.016278 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rvsk" event={"ID":"c7d278b3-153c-4a0f-b333-a568a2be0b9a","Type":"ContainerDied","Data":"b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d"} Dec 12 00:38:18 crc kubenswrapper[4606]: I1212 00:38:18.016321 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rvsk" event={"ID":"c7d278b3-153c-4a0f-b333-a568a2be0b9a","Type":"ContainerStarted","Data":"500d30612b4ce2c26a39decf7201647829d53b1b0f794a068abd0e77324ba413"} Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.032862 4606 generic.go:334] "Generic (PLEG): container finished" podID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerID="b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f" exitCode=0 Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.032930 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rvsk" event={"ID":"c7d278b3-153c-4a0f-b333-a568a2be0b9a","Type":"ContainerDied","Data":"b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f"} Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.513715 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pw5mx"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.514085 4606 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-pw5mx" podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="registry-server" containerID="cri-o://9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2" gracePeriod=30 Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.539747 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq42n"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.540261 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wq42n" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="registry-server" containerID="cri-o://8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb" gracePeriod=30 Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.562415 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rq529"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.562585 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" podUID="d4738423-c294-4595-8656-3a2ebd437a75" containerName="marketplace-operator" containerID="cri-o://5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9" gracePeriod=30 Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.570881 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7skb"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.571223 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k7skb" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerName="registry-server" containerID="cri-o://95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a" gracePeriod=30 Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.577616 4606 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rvsk"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.593075 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8wq4"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.593402 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w8wq4" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerName="registry-server" containerID="cri-o://8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745" gracePeriod=30 Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.598054 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqq5z"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.598770 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.606847 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqq5z"] Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.622884 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxl2n\" (UniqueName: \"kubernetes.io/projected/bf10904b-21cd-4987-bedb-118b0992002a-kube-api-access-lxl2n\") pod \"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.622971 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf10904b-21cd-4987-bedb-118b0992002a-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.623085 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf10904b-21cd-4987-bedb-118b0992002a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: E1212 00:38:20.660870 4606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70af385d_13b8_4ff2_8c35_fb9402388dd6.slice/crio-9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4738423_c294_4595_8656_3a2ebd437a75.slice/crio-5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9.scope\": RecentStats: unable to find data in memory cache]" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.728083 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf10904b-21cd-4987-bedb-118b0992002a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.728165 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxl2n\" (UniqueName: \"kubernetes.io/projected/bf10904b-21cd-4987-bedb-118b0992002a-kube-api-access-lxl2n\") pod 
\"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.728305 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf10904b-21cd-4987-bedb-118b0992002a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.730348 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf10904b-21cd-4987-bedb-118b0992002a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.740543 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf10904b-21cd-4987-bedb-118b0992002a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.752648 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxl2n\" (UniqueName: \"kubernetes.io/projected/bf10904b-21cd-4987-bedb-118b0992002a-kube-api-access-lxl2n\") pod \"marketplace-operator-79b997595-xqq5z\" (UID: \"bf10904b-21cd-4987-bedb-118b0992002a\") " pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.887703 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:20 crc kubenswrapper[4606]: I1212 00:38:20.971781 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pw5mx" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.032207 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-catalog-content\") pod \"70af385d-13b8-4ff2-8c35-fb9402388dd6\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.032287 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c55zp\" (UniqueName: \"kubernetes.io/projected/70af385d-13b8-4ff2-8c35-fb9402388dd6-kube-api-access-c55zp\") pod \"70af385d-13b8-4ff2-8c35-fb9402388dd6\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.032379 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-utilities\") pod \"70af385d-13b8-4ff2-8c35-fb9402388dd6\" (UID: \"70af385d-13b8-4ff2-8c35-fb9402388dd6\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.033614 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-utilities" (OuterVolumeSpecName: "utilities") pod "70af385d-13b8-4ff2-8c35-fb9402388dd6" (UID: "70af385d-13b8-4ff2-8c35-fb9402388dd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.039618 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wq42n" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.054208 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70af385d-13b8-4ff2-8c35-fb9402388dd6-kube-api-access-c55zp" (OuterVolumeSpecName: "kube-api-access-c55zp") pod "70af385d-13b8-4ff2-8c35-fb9402388dd6" (UID: "70af385d-13b8-4ff2-8c35-fb9402388dd6"). InnerVolumeSpecName "kube-api-access-c55zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.075346 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w8wq4" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.088544 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7skb" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.088664 4606 generic.go:334] "Generic (PLEG): container finished" podID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerID="8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb" exitCode=0 Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.088713 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq42n" event={"ID":"bc97a6ce-dd87-4a67-a1ec-99dcce21178c","Type":"ContainerDied","Data":"8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.088736 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq42n" event={"ID":"bc97a6ce-dd87-4a67-a1ec-99dcce21178c","Type":"ContainerDied","Data":"f7a7a919e005fb2c388486c89d39f5dac935a3f51b25d257f61fd32d5fc09f89"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.088746 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wq42n" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.088752 4606 scope.go:117] "RemoveContainer" containerID="8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.092750 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.106597 4606 generic.go:334] "Generic (PLEG): container finished" podID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerID="8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745" exitCode=0 Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.106660 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8wq4" event={"ID":"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d","Type":"ContainerDied","Data":"8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.106680 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8wq4" event={"ID":"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d","Type":"ContainerDied","Data":"1a02ef6e49940ba5034184f33549700a528def3799dc97beb2157cd4e353c3eb"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.106733 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8wq4" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.126391 4606 generic.go:334] "Generic (PLEG): container finished" podID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerID="9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2" exitCode=0 Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.126466 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pw5mx" event={"ID":"70af385d-13b8-4ff2-8c35-fb9402388dd6","Type":"ContainerDied","Data":"9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.126488 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pw5mx" event={"ID":"70af385d-13b8-4ff2-8c35-fb9402388dd6","Type":"ContainerDied","Data":"2a590979af5e0f826ee30e77dca5349365f4395cdc2845c5a68eba437bfc7dbf"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.126613 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pw5mx" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.128935 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rvsk" event={"ID":"c7d278b3-153c-4a0f-b333-a568a2be0b9a","Type":"ContainerStarted","Data":"5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.129076 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5rvsk" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="registry-server" containerID="cri-o://5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757" gracePeriod=30 Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.130294 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70af385d-13b8-4ff2-8c35-fb9402388dd6" (UID: "70af385d-13b8-4ff2-8c35-fb9402388dd6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133145 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw777\" (UniqueName: \"kubernetes.io/projected/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-kube-api-access-hw777\") pod \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133197 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbxd2\" (UniqueName: \"kubernetes.io/projected/d4738423-c294-4595-8656-3a2ebd437a75-kube-api-access-fbxd2\") pod \"d4738423-c294-4595-8656-3a2ebd437a75\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133301 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-catalog-content\") pod \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133362 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-catalog-content\") pod \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133395 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-utilities\") pod \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\" (UID: \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133410 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-utilities\") pod \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133431 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-trusted-ca\") pod \"d4738423-c294-4595-8656-3a2ebd437a75\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133445 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-utilities\") pod \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133496 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-operator-metrics\") pod \"d4738423-c294-4595-8656-3a2ebd437a75\" (UID: \"d4738423-c294-4595-8656-3a2ebd437a75\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133515 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-catalog-content\") pod \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\" (UID: \"bc97a6ce-dd87-4a67-a1ec-99dcce21178c\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133533 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssjsk\" (UniqueName: \"kubernetes.io/projected/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-kube-api-access-ssjsk\") pod \"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\" (UID: 
\"2d17c9ef-183f-49d5-96ef-c21b165d4f2a\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133551 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9627\" (UniqueName: \"kubernetes.io/projected/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-kube-api-access-w9627\") pod \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\" (UID: \"38061ed5-dc1b-4c0b-9b0d-02412c9ca54d\") " Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133729 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c55zp\" (UniqueName: \"kubernetes.io/projected/70af385d-13b8-4ff2-8c35-fb9402388dd6-kube-api-access-c55zp\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133740 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.133749 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70af385d-13b8-4ff2-8c35-fb9402388dd6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.138831 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d4738423-c294-4595-8656-3a2ebd437a75" (UID: "d4738423-c294-4595-8656-3a2ebd437a75"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.139535 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d4738423-c294-4595-8656-3a2ebd437a75" (UID: "d4738423-c294-4595-8656-3a2ebd437a75"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.140806 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-utilities" (OuterVolumeSpecName: "utilities") pod "bc97a6ce-dd87-4a67-a1ec-99dcce21178c" (UID: "bc97a6ce-dd87-4a67-a1ec-99dcce21178c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.140977 4606 generic.go:334] "Generic (PLEG): container finished" podID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerID="95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a" exitCode=0 Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.141053 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7skb" event={"ID":"2d17c9ef-183f-49d5-96ef-c21b165d4f2a","Type":"ContainerDied","Data":"95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.141079 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7skb" event={"ID":"2d17c9ef-183f-49d5-96ef-c21b165d4f2a","Type":"ContainerDied","Data":"7d96b8088a3d59a9430deda5cedd21a54bf88db3c389ffe2fcd879b877734e75"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.141147 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7skb" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.144047 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-utilities" (OuterVolumeSpecName: "utilities") pod "38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" (UID: "38061ed5-dc1b-4c0b-9b0d-02412c9ca54d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.146295 4606 scope.go:117] "RemoveContainer" containerID="da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.148038 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-utilities" (OuterVolumeSpecName: "utilities") pod "2d17c9ef-183f-49d5-96ef-c21b165d4f2a" (UID: "2d17c9ef-183f-49d5-96ef-c21b165d4f2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.150192 4606 generic.go:334] "Generic (PLEG): container finished" podID="d4738423-c294-4595-8656-3a2ebd437a75" containerID="5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9" exitCode=0 Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.150230 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" event={"ID":"d4738423-c294-4595-8656-3a2ebd437a75","Type":"ContainerDied","Data":"5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.150255 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" event={"ID":"d4738423-c294-4595-8656-3a2ebd437a75","Type":"ContainerDied","Data":"2eacef2297de561f4ab9ba762aef3dd724123e7a7bb017e579f526dddb5bd7f4"} Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.150326 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rq529" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.153669 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4738423-c294-4595-8656-3a2ebd437a75-kube-api-access-fbxd2" (OuterVolumeSpecName: "kube-api-access-fbxd2") pod "d4738423-c294-4595-8656-3a2ebd437a75" (UID: "d4738423-c294-4595-8656-3a2ebd437a75"). InnerVolumeSpecName "kube-api-access-fbxd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.158721 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-kube-api-access-w9627" (OuterVolumeSpecName: "kube-api-access-w9627") pod "38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" (UID: "38061ed5-dc1b-4c0b-9b0d-02412c9ca54d"). InnerVolumeSpecName "kube-api-access-w9627". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.163120 4606 scope.go:117] "RemoveContainer" containerID="7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.164639 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-kube-api-access-hw777" (OuterVolumeSpecName: "kube-api-access-hw777") pod "bc97a6ce-dd87-4a67-a1ec-99dcce21178c" (UID: "bc97a6ce-dd87-4a67-a1ec-99dcce21178c"). InnerVolumeSpecName "kube-api-access-hw777". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.170331 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d17c9ef-183f-49d5-96ef-c21b165d4f2a" (UID: "2d17c9ef-183f-49d5-96ef-c21b165d4f2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.170995 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-kube-api-access-ssjsk" (OuterVolumeSpecName: "kube-api-access-ssjsk") pod "2d17c9ef-183f-49d5-96ef-c21b165d4f2a" (UID: "2d17c9ef-183f-49d5-96ef-c21b165d4f2a"). 
InnerVolumeSpecName "kube-api-access-ssjsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.181040 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5rvsk" podStartSLOduration=2.742938643 podStartE2EDuration="5.181021796s" podCreationTimestamp="2025-12-12 00:38:16 +0000 UTC" firstStartedPulling="2025-12-12 00:38:18.017959243 +0000 UTC m=+888.563312109" lastFinishedPulling="2025-12-12 00:38:20.456042406 +0000 UTC m=+891.001395262" observedRunningTime="2025-12-12 00:38:21.180919753 +0000 UTC m=+891.726272619" watchObservedRunningTime="2025-12-12 00:38:21.181021796 +0000 UTC m=+891.726374662" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.183005 4606 scope.go:117] "RemoveContainer" containerID="8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.184245 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb\": container with ID starting with 8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb not found: ID does not exist" containerID="8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.184300 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb"} err="failed to get container status \"8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb\": rpc error: code = NotFound desc = could not find container \"8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb\": container with ID starting with 8f83ef2ee07d344164a5648542b1fada488caece01fdcd51890d8b38ac8a53fb not found: ID does not exist" Dec 12 00:38:21 crc 
kubenswrapper[4606]: I1212 00:38:21.184323 4606 scope.go:117] "RemoveContainer" containerID="da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.184653 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358\": container with ID starting with da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358 not found: ID does not exist" containerID="da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.184694 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358"} err="failed to get container status \"da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358\": rpc error: code = NotFound desc = could not find container \"da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358\": container with ID starting with da8179203795058d13db6c8d7bc94520d4831300e6349d2ee80d7450cc334358 not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.184708 4606 scope.go:117] "RemoveContainer" containerID="7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.185025 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c\": container with ID starting with 7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c not found: ID does not exist" containerID="7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.185041 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c"} err="failed to get container status \"7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c\": rpc error: code = NotFound desc = could not find container \"7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c\": container with ID starting with 7efe270a47d608787682fc344c031f32f0c46a60593754c623b6783f50bb3d9c not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.185053 4606 scope.go:117] "RemoveContainer" containerID="8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.213602 4606 scope.go:117] "RemoveContainer" containerID="6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.226486 4606 scope.go:117] "RemoveContainer" containerID="e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.232071 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc97a6ce-dd87-4a67-a1ec-99dcce21178c" (UID: "bc97a6ce-dd87-4a67-a1ec-99dcce21178c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235348 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw777\" (UniqueName: \"kubernetes.io/projected/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-kube-api-access-hw777\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235372 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbxd2\" (UniqueName: \"kubernetes.io/projected/d4738423-c294-4595-8656-3a2ebd437a75-kube-api-access-fbxd2\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235382 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235391 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235400 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235408 4606 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235417 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 
00:38:21.235425 4606 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4738423-c294-4595-8656-3a2ebd437a75-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235433 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc97a6ce-dd87-4a67-a1ec-99dcce21178c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235442 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssjsk\" (UniqueName: \"kubernetes.io/projected/2d17c9ef-183f-49d5-96ef-c21b165d4f2a-kube-api-access-ssjsk\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.235450 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9627\" (UniqueName: \"kubernetes.io/projected/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-kube-api-access-w9627\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.238878 4606 scope.go:117] "RemoveContainer" containerID="8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.239286 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745\": container with ID starting with 8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745 not found: ID does not exist" containerID="8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.239330 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745"} err="failed to get container status 
\"8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745\": rpc error: code = NotFound desc = could not find container \"8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745\": container with ID starting with 8a319995b7a8f0d536552df335865b18913ff2c30a133cdcf23bb997b1a78745 not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.239355 4606 scope.go:117] "RemoveContainer" containerID="6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.239641 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e\": container with ID starting with 6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e not found: ID does not exist" containerID="6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.239664 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e"} err="failed to get container status \"6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e\": rpc error: code = NotFound desc = could not find container \"6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e\": container with ID starting with 6c62db75a9ea96382949eb1e6967f9693114555936795db1dada693b1e4e804e not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.239684 4606 scope.go:117] "RemoveContainer" containerID="e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.239865 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed\": container with ID starting with e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed not found: ID does not exist" containerID="e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.239879 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed"} err="failed to get container status \"e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed\": rpc error: code = NotFound desc = could not find container \"e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed\": container with ID starting with e33a038124d44fcdfd761e6476d87f2d367f2c6abc0ced506a28f5a9086bb5ed not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.239891 4606 scope.go:117] "RemoveContainer" containerID="9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.254333 4606 scope.go:117] "RemoveContainer" containerID="062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.277326 4606 scope.go:117] "RemoveContainer" containerID="286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.293848 4606 scope.go:117] "RemoveContainer" containerID="9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.294294 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2\": container with ID starting with 9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2 not found: ID does not exist" 
containerID="9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.294341 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2"} err="failed to get container status \"9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2\": rpc error: code = NotFound desc = could not find container \"9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2\": container with ID starting with 9c838d0f57f826a673eda5f07bb7b9e852a12c247b1525732ac623e2f33d29a2 not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.294380 4606 scope.go:117] "RemoveContainer" containerID="062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.294743 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3\": container with ID starting with 062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3 not found: ID does not exist" containerID="062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.294772 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3"} err="failed to get container status \"062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3\": rpc error: code = NotFound desc = could not find container \"062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3\": container with ID starting with 062e0f6972301e6bc2c5566b56e43001c409ef93c3281ec87189afafc9c02df3 not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.294798 4606 scope.go:117] 
"RemoveContainer" containerID="286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.295196 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40\": container with ID starting with 286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40 not found: ID does not exist" containerID="286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.295229 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40"} err="failed to get container status \"286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40\": rpc error: code = NotFound desc = could not find container \"286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40\": container with ID starting with 286fc20de13dc15f56799072586e0523519fafb40d77ce41e87e673ef7a03d40 not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.295250 4606 scope.go:117] "RemoveContainer" containerID="95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.307062 4606 scope.go:117] "RemoveContainer" containerID="85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.308076 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" (UID: "38061ed5-dc1b-4c0b-9b0d-02412c9ca54d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.318024 4606 scope.go:117] "RemoveContainer" containerID="0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.336931 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.341476 4606 scope.go:117] "RemoveContainer" containerID="95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.341923 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a\": container with ID starting with 95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a not found: ID does not exist" containerID="95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.341962 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a"} err="failed to get container status \"95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a\": rpc error: code = NotFound desc = could not find container \"95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a\": container with ID starting with 95455317867ba998f8aa931ff46ee5588cad115c1b0fbd219e99ebf9ee57ee9a not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.341984 4606 scope.go:117] "RemoveContainer" containerID="85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.342375 4606 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3\": container with ID starting with 85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3 not found: ID does not exist" containerID="85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.342438 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3"} err="failed to get container status \"85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3\": rpc error: code = NotFound desc = could not find container \"85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3\": container with ID starting with 85d5c5bd02932ea8d1cea155755d33e11793785a59e9dc3d13e5921f20874ec3 not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.342481 4606 scope.go:117] "RemoveContainer" containerID="0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.342858 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e\": container with ID starting with 0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e not found: ID does not exist" containerID="0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.342886 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e"} err="failed to get container status \"0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e\": rpc error: code = NotFound desc = could 
not find container \"0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e\": container with ID starting with 0e9c7da8894bb65f0c867cd32525f75d12dc49957c0f53e7b43c9f3b5069c89e not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.342902 4606 scope.go:117] "RemoveContainer" containerID="5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.361077 4606 scope.go:117] "RemoveContainer" containerID="5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9" Dec 12 00:38:21 crc kubenswrapper[4606]: E1212 00:38:21.361614 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9\": container with ID starting with 5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9 not found: ID does not exist" containerID="5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.361653 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9"} err="failed to get container status \"5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9\": rpc error: code = NotFound desc = could not find container \"5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9\": container with ID starting with 5039c720cb348cfc5fe9304089c81828c6ddce45c3bb46bd39a8727206bff5a9 not found: ID does not exist" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.401363 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xqq5z"] Dec 12 00:38:21 crc kubenswrapper[4606]: W1212 00:38:21.411941 4606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf10904b_21cd_4987_bedb_118b0992002a.slice/crio-690d0ca798691ec36292b485680146f9144fad946dc2fc7889b514c2a7164517 WatchSource:0}: Error finding container 690d0ca798691ec36292b485680146f9144fad946dc2fc7889b514c2a7164517: Status 404 returned error can't find the container with id 690d0ca798691ec36292b485680146f9144fad946dc2fc7889b514c2a7164517 Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.429233 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq42n"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.438610 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wq42n"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.449752 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8wq4"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.453538 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w8wq4"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.470681 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pw5mx"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.473293 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pw5mx"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.488890 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7skb"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.497742 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7skb"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.502656 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rq529"] Dec 12 00:38:21 crc 
kubenswrapper[4606]: I1212 00:38:21.505918 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rq529"] Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.710489 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" path="/var/lib/kubelet/pods/2d17c9ef-183f-49d5-96ef-c21b165d4f2a/volumes" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.711392 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" path="/var/lib/kubelet/pods/38061ed5-dc1b-4c0b-9b0d-02412c9ca54d/volumes" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.712289 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" path="/var/lib/kubelet/pods/70af385d-13b8-4ff2-8c35-fb9402388dd6/volumes" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.713774 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" path="/var/lib/kubelet/pods/bc97a6ce-dd87-4a67-a1ec-99dcce21178c/volumes" Dec 12 00:38:21 crc kubenswrapper[4606]: I1212 00:38:21.714827 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4738423-c294-4595-8656-3a2ebd437a75" path="/var/lib/kubelet/pods/d4738423-c294-4595-8656-3a2ebd437a75/volumes" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.112244 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5rvsk_c7d278b3-153c-4a0f-b333-a568a2be0b9a/registry-server/0.log" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.112928 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.162591 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" event={"ID":"bf10904b-21cd-4987-bedb-118b0992002a","Type":"ContainerStarted","Data":"12fe8de01642377afe7ded90dfe75e59a3d85ee0bf9d2a57af94d47e9dfefe09"} Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.162638 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.162653 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" event={"ID":"bf10904b-21cd-4987-bedb-118b0992002a","Type":"ContainerStarted","Data":"690d0ca798691ec36292b485680146f9144fad946dc2fc7889b514c2a7164517"} Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.164455 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5rvsk_c7d278b3-153c-4a0f-b333-a568a2be0b9a/registry-server/0.log" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.165511 4606 generic.go:334] "Generic (PLEG): container finished" podID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerID="5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757" exitCode=1 Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.165541 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rvsk" event={"ID":"c7d278b3-153c-4a0f-b333-a568a2be0b9a","Type":"ContainerDied","Data":"5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757"} Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.165598 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rvsk" 
event={"ID":"c7d278b3-153c-4a0f-b333-a568a2be0b9a","Type":"ContainerDied","Data":"500d30612b4ce2c26a39decf7201647829d53b1b0f794a068abd0e77324ba413"} Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.165617 4606 scope.go:117] "RemoveContainer" containerID="5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.165630 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rvsk" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.166811 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.182635 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xqq5z" podStartSLOduration=2.182624434 podStartE2EDuration="2.182624434s" podCreationTimestamp="2025-12-12 00:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:38:22.181559656 +0000 UTC m=+892.726912532" watchObservedRunningTime="2025-12-12 00:38:22.182624434 +0000 UTC m=+892.727977300" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.191145 4606 scope.go:117] "RemoveContainer" containerID="b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.228232 4606 scope.go:117] "RemoveContainer" containerID="b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.256873 4606 scope.go:117] "RemoveContainer" containerID="5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.257526 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757\": container with ID starting with 5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757 not found: ID does not exist" containerID="5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.257565 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757"} err="failed to get container status \"5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757\": rpc error: code = NotFound desc = could not find container \"5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757\": container with ID starting with 5e5939c44ee86ad573ea0add0cde22fcbe2e06f5e46cb0831383aad657cd5757 not found: ID does not exist" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.257595 4606 scope.go:117] "RemoveContainer" containerID="b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.258118 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f\": container with ID starting with b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f not found: ID does not exist" containerID="b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.258149 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f"} err="failed to get container status \"b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f\": rpc error: code = NotFound desc = could not find container 
\"b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f\": container with ID starting with b43217fc1bd1023aa97eb2c46a3e9b267aa5fee9495084dceaa66fde13b6e19f not found: ID does not exist" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.258205 4606 scope.go:117] "RemoveContainer" containerID="b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.260207 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d\": container with ID starting with b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d not found: ID does not exist" containerID="b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.260249 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d"} err="failed to get container status \"b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d\": rpc error: code = NotFound desc = could not find container \"b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d\": container with ID starting with b7d78f2d77e0cb6042611f97eb4c0985bb9ec7a85e07b9c59462c1f999d23b7d not found: ID does not exist" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.261890 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-utilities\") pod \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.261965 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8srt\" (UniqueName: 
\"kubernetes.io/projected/c7d278b3-153c-4a0f-b333-a568a2be0b9a-kube-api-access-h8srt\") pod \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.262006 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-catalog-content\") pod \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\" (UID: \"c7d278b3-153c-4a0f-b333-a568a2be0b9a\") " Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.263536 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-utilities" (OuterVolumeSpecName: "utilities") pod "c7d278b3-153c-4a0f-b333-a568a2be0b9a" (UID: "c7d278b3-153c-4a0f-b333-a568a2be0b9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.267725 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d278b3-153c-4a0f-b333-a568a2be0b9a-kube-api-access-h8srt" (OuterVolumeSpecName: "kube-api-access-h8srt") pod "c7d278b3-153c-4a0f-b333-a568a2be0b9a" (UID: "c7d278b3-153c-4a0f-b333-a568a2be0b9a"). InnerVolumeSpecName "kube-api-access-h8srt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.363093 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.363127 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8srt\" (UniqueName: \"kubernetes.io/projected/c7d278b3-153c-4a0f-b333-a568a2be0b9a-kube-api-access-h8srt\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.390166 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7d278b3-153c-4a0f-b333-a568a2be0b9a" (UID: "c7d278b3-153c-4a0f-b333-a568a2be0b9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.464023 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d278b3-153c-4a0f-b333-a568a2be0b9a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.508722 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rvsk"] Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.513248 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5rvsk"] Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752114 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c6vsb"] Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752430 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" 
containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752460 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752480 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752494 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752516 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752528 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752541 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752554 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752572 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752583 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752602 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" 
containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752614 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752628 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752642 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752658 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752670 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752694 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4738423-c294-4595-8656-3a2ebd437a75" containerName="marketplace-operator" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752706 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4738423-c294-4595-8656-3a2ebd437a75" containerName="marketplace-operator" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752721 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752732 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752747 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752761 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752779 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752791 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752810 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752823 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752838 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752849 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752868 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752879 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="extract-utilities" Dec 12 00:38:22 crc kubenswrapper[4606]: E1212 00:38:22.752893 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.752906 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="extract-content" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.753055 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc97a6ce-dd87-4a67-a1ec-99dcce21178c" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.753077 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d17c9ef-183f-49d5-96ef-c21b165d4f2a" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.753093 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.753118 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="70af385d-13b8-4ff2-8c35-fb9402388dd6" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.753132 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4738423-c294-4595-8656-3a2ebd437a75" containerName="marketplace-operator" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.753150 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="38061ed5-dc1b-4c0b-9b0d-02412c9ca54d" containerName="registry-server" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.754351 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.758332 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.766035 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6vsb"] Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.870309 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-utilities\") pod \"certified-operators-c6vsb\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.870372 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-catalog-content\") pod \"certified-operators-c6vsb\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.870546 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmgk2\" (UniqueName: \"kubernetes.io/projected/c717d9dc-08d4-4863-8788-0f151d9f6c21-kube-api-access-gmgk2\") pod \"certified-operators-c6vsb\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.971367 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-utilities\") pod \"certified-operators-c6vsb\" (UID: 
\"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.971435 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-catalog-content\") pod \"certified-operators-c6vsb\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.971456 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmgk2\" (UniqueName: \"kubernetes.io/projected/c717d9dc-08d4-4863-8788-0f151d9f6c21-kube-api-access-gmgk2\") pod \"certified-operators-c6vsb\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.972102 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-catalog-content\") pod \"certified-operators-c6vsb\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.972282 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-utilities\") pod \"certified-operators-c6vsb\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:22 crc kubenswrapper[4606]: I1212 00:38:22.990652 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmgk2\" (UniqueName: \"kubernetes.io/projected/c717d9dc-08d4-4863-8788-0f151d9f6c21-kube-api-access-gmgk2\") pod \"certified-operators-c6vsb\" (UID: 
\"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.087039 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.151624 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6bmzj"] Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.152754 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.170246 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bmzj"] Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.175151 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-catalog-content\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.175252 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhr6p\" (UniqueName: \"kubernetes.io/projected/f8f2888b-ed28-4fb8-b268-b7190b535644-kube-api-access-hhr6p\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.175303 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-utilities\") pod \"certified-operators-6bmzj\" (UID: 
\"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.276325 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-catalog-content\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.276747 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhr6p\" (UniqueName: \"kubernetes.io/projected/f8f2888b-ed28-4fb8-b268-b7190b535644-kube-api-access-hhr6p\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.276772 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-utilities\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.276895 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-catalog-content\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.277248 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-utilities\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") 
" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.297495 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhr6p\" (UniqueName: \"kubernetes.io/projected/f8f2888b-ed28-4fb8-b268-b7190b535644-kube-api-access-hhr6p\") pod \"certified-operators-6bmzj\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.314821 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6vsb"] Dec 12 00:38:23 crc kubenswrapper[4606]: W1212 00:38:23.323439 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc717d9dc_08d4_4863_8788_0f151d9f6c21.slice/crio-852f71ff7586852577f844b4c5db496b86c5420dfa5c50a7213dfd6ad338cd60 WatchSource:0}: Error finding container 852f71ff7586852577f844b4c5db496b86c5420dfa5c50a7213dfd6ad338cd60: Status 404 returned error can't find the container with id 852f71ff7586852577f844b4c5db496b86c5420dfa5c50a7213dfd6ad338cd60 Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.348443 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x6rsl"] Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.349766 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.352249 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.357538 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6rsl"] Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.377229 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcpm\" (UniqueName: \"kubernetes.io/projected/bf73008f-0c71-4676-ae7a-8a3256c3df05-kube-api-access-7rcpm\") pod \"community-operators-x6rsl\" (UID: \"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.377304 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf73008f-0c71-4676-ae7a-8a3256c3df05-catalog-content\") pod \"community-operators-x6rsl\" (UID: \"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.377344 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf73008f-0c71-4676-ae7a-8a3256c3df05-utilities\") pod \"community-operators-x6rsl\" (UID: \"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.478074 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf73008f-0c71-4676-ae7a-8a3256c3df05-utilities\") pod \"community-operators-x6rsl\" (UID: 
\"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.478361 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcpm\" (UniqueName: \"kubernetes.io/projected/bf73008f-0c71-4676-ae7a-8a3256c3df05-kube-api-access-7rcpm\") pod \"community-operators-x6rsl\" (UID: \"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.478478 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf73008f-0c71-4676-ae7a-8a3256c3df05-catalog-content\") pod \"community-operators-x6rsl\" (UID: \"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.478587 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf73008f-0c71-4676-ae7a-8a3256c3df05-utilities\") pod \"community-operators-x6rsl\" (UID: \"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.479006 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf73008f-0c71-4676-ae7a-8a3256c3df05-catalog-content\") pod \"community-operators-x6rsl\" (UID: \"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.494612 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcpm\" (UniqueName: \"kubernetes.io/projected/bf73008f-0c71-4676-ae7a-8a3256c3df05-kube-api-access-7rcpm\") pod \"community-operators-x6rsl\" (UID: 
\"bf73008f-0c71-4676-ae7a-8a3256c3df05\") " pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.496541 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.655439 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bmzj"] Dec 12 00:38:23 crc kubenswrapper[4606]: W1212 00:38:23.664155 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f2888b_ed28_4fb8_b268_b7190b535644.slice/crio-e005765e60e24df6c3391b9b94d7d0782018b5babfcbd7ca790b2e8e13148179 WatchSource:0}: Error finding container e005765e60e24df6c3391b9b94d7d0782018b5babfcbd7ca790b2e8e13148179: Status 404 returned error can't find the container with id e005765e60e24df6c3391b9b94d7d0782018b5babfcbd7ca790b2e8e13148179 Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.668745 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.706287 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d278b3-153c-4a0f-b333-a568a2be0b9a" path="/var/lib/kubelet/pods/c7d278b3-153c-4a0f-b333-a568a2be0b9a/volumes" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.742032 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6dmf2"] Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.742904 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.760051 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6dmf2"] Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.782907 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-catalog-content\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.782979 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-utilities\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.783006 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7cfr\" (UniqueName: \"kubernetes.io/projected/0986169d-1add-4351-8132-d483cfcf36ef-kube-api-access-p7cfr\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.862657 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6rsl"] Dec 12 00:38:23 crc kubenswrapper[4606]: W1212 00:38:23.869955 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf73008f_0c71_4676_ae7a_8a3256c3df05.slice/crio-fb9b1725d852b7fdf799e97af8aaebbd2b135907ba5f1d2d1ed76fe9b710fdd9 WatchSource:0}: 
Error finding container fb9b1725d852b7fdf799e97af8aaebbd2b135907ba5f1d2d1ed76fe9b710fdd9: Status 404 returned error can't find the container with id fb9b1725d852b7fdf799e97af8aaebbd2b135907ba5f1d2d1ed76fe9b710fdd9 Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.884082 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-catalog-content\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.884125 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-utilities\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.884147 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7cfr\" (UniqueName: \"kubernetes.io/projected/0986169d-1add-4351-8132-d483cfcf36ef-kube-api-access-p7cfr\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.884518 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-utilities\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.884806 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-catalog-content\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:23 crc kubenswrapper[4606]: I1212 00:38:23.903226 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7cfr\" (UniqueName: \"kubernetes.io/projected/0986169d-1add-4351-8132-d483cfcf36ef-kube-api-access-p7cfr\") pod \"community-operators-6dmf2\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.083553 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.181286 4606 generic.go:334] "Generic (PLEG): container finished" podID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerID="4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb" exitCode=0 Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.181408 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6vsb" event={"ID":"c717d9dc-08d4-4863-8788-0f151d9f6c21","Type":"ContainerDied","Data":"4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb"} Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.181449 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6vsb" event={"ID":"c717d9dc-08d4-4863-8788-0f151d9f6c21","Type":"ContainerStarted","Data":"852f71ff7586852577f844b4c5db496b86c5420dfa5c50a7213dfd6ad338cd60"} Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.195868 4606 generic.go:334] "Generic (PLEG): container finished" podID="bf73008f-0c71-4676-ae7a-8a3256c3df05" containerID="82df32511e048a9cf61c8e0ea6d1e57620190b623b88f3c56280ed550aa1d0c1" exitCode=0 Dec 12 00:38:24 crc 
kubenswrapper[4606]: I1212 00:38:24.195986 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6rsl" event={"ID":"bf73008f-0c71-4676-ae7a-8a3256c3df05","Type":"ContainerDied","Data":"82df32511e048a9cf61c8e0ea6d1e57620190b623b88f3c56280ed550aa1d0c1"} Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.196038 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6rsl" event={"ID":"bf73008f-0c71-4676-ae7a-8a3256c3df05","Type":"ContainerStarted","Data":"fb9b1725d852b7fdf799e97af8aaebbd2b135907ba5f1d2d1ed76fe9b710fdd9"} Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.212453 4606 generic.go:334] "Generic (PLEG): container finished" podID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerID="223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016" exitCode=0 Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.213052 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bmzj" event={"ID":"f8f2888b-ed28-4fb8-b268-b7190b535644","Type":"ContainerDied","Data":"223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016"} Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.213075 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bmzj" event={"ID":"f8f2888b-ed28-4fb8-b268-b7190b535644","Type":"ContainerStarted","Data":"e005765e60e24df6c3391b9b94d7d0782018b5babfcbd7ca790b2e8e13148179"} Dec 12 00:38:24 crc kubenswrapper[4606]: I1212 00:38:24.590212 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6dmf2"] Dec 12 00:38:24 crc kubenswrapper[4606]: W1212 00:38:24.597693 4606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0986169d_1add_4351_8132_d483cfcf36ef.slice/crio-98298e33d240d4d89285e34f8f0340017806b7ca661afd94d9f00370b253823e WatchSource:0}: Error finding container 98298e33d240d4d89285e34f8f0340017806b7ca661afd94d9f00370b253823e: Status 404 returned error can't find the container with id 98298e33d240d4d89285e34f8f0340017806b7ca661afd94d9f00370b253823e Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.220193 4606 generic.go:334] "Generic (PLEG): container finished" podID="0986169d-1add-4351-8132-d483cfcf36ef" containerID="086cb42c1dcc27a71dae13fb063221a2ccd981e6aff622701d7d5729a731c485" exitCode=0 Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.220272 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dmf2" event={"ID":"0986169d-1add-4351-8132-d483cfcf36ef","Type":"ContainerDied","Data":"086cb42c1dcc27a71dae13fb063221a2ccd981e6aff622701d7d5729a731c485"} Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.220304 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dmf2" event={"ID":"0986169d-1add-4351-8132-d483cfcf36ef","Type":"ContainerStarted","Data":"98298e33d240d4d89285e34f8f0340017806b7ca661afd94d9f00370b253823e"} Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.224364 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6rsl" event={"ID":"bf73008f-0c71-4676-ae7a-8a3256c3df05","Type":"ContainerStarted","Data":"cf14ad7ab31b14c007f5cbdd6e19bd3b7e90074f8dff217b5669a4f41beca0f0"} Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.550224 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v92fv"] Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.555442 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.557788 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.567674 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v92fv"] Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.707728 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f91f8e-c973-4ffb-89d2-8b0683578a84-utilities\") pod \"redhat-marketplace-v92fv\" (UID: \"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.707788 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjp5\" (UniqueName: \"kubernetes.io/projected/74f91f8e-c973-4ffb-89d2-8b0683578a84-kube-api-access-8rjp5\") pod \"redhat-marketplace-v92fv\" (UID: \"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.707830 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f91f8e-c973-4ffb-89d2-8b0683578a84-catalog-content\") pod \"redhat-marketplace-v92fv\" (UID: \"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.808752 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f91f8e-c973-4ffb-89d2-8b0683578a84-utilities\") pod \"redhat-marketplace-v92fv\" (UID: 
\"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.808804 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjp5\" (UniqueName: \"kubernetes.io/projected/74f91f8e-c973-4ffb-89d2-8b0683578a84-kube-api-access-8rjp5\") pod \"redhat-marketplace-v92fv\" (UID: \"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.808830 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f91f8e-c973-4ffb-89d2-8b0683578a84-catalog-content\") pod \"redhat-marketplace-v92fv\" (UID: \"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.809270 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f91f8e-c973-4ffb-89d2-8b0683578a84-catalog-content\") pod \"redhat-marketplace-v92fv\" (UID: \"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.809271 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f91f8e-c973-4ffb-89d2-8b0683578a84-utilities\") pod \"redhat-marketplace-v92fv\" (UID: \"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.838914 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjp5\" (UniqueName: \"kubernetes.io/projected/74f91f8e-c973-4ffb-89d2-8b0683578a84-kube-api-access-8rjp5\") pod \"redhat-marketplace-v92fv\" (UID: 
\"74f91f8e-c973-4ffb-89d2-8b0683578a84\") " pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.872625 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.953853 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xvcv5"] Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.954989 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:25 crc kubenswrapper[4606]: I1212 00:38:25.965602 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvcv5"] Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.021053 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft4p5\" (UniqueName: \"kubernetes.io/projected/c0375875-a842-430b-8e0f-996d03beb60f-kube-api-access-ft4p5\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.021125 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-catalog-content\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.021161 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-utilities\") pod \"redhat-marketplace-xvcv5\" (UID: 
\"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.077793 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v92fv"] Dec 12 00:38:26 crc kubenswrapper[4606]: W1212 00:38:26.085015 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74f91f8e_c973_4ffb_89d2_8b0683578a84.slice/crio-515ae54dd123191bd17aedb9c91857fa3cfdbdc2dc4f34907d026cbe14e533aa WatchSource:0}: Error finding container 515ae54dd123191bd17aedb9c91857fa3cfdbdc2dc4f34907d026cbe14e533aa: Status 404 returned error can't find the container with id 515ae54dd123191bd17aedb9c91857fa3cfdbdc2dc4f34907d026cbe14e533aa Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.122649 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft4p5\" (UniqueName: \"kubernetes.io/projected/c0375875-a842-430b-8e0f-996d03beb60f-kube-api-access-ft4p5\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.122722 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-catalog-content\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.122750 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-utilities\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 
00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.123562 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-catalog-content\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.123617 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-utilities\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.138453 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft4p5\" (UniqueName: \"kubernetes.io/projected/c0375875-a842-430b-8e0f-996d03beb60f-kube-api-access-ft4p5\") pod \"redhat-marketplace-xvcv5\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.159770 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l7nww"] Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.161212 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.163441 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.167161 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7nww"] Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.223910 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dlcq\" (UniqueName: \"kubernetes.io/projected/9df9dd2c-220d-43e3-a680-21d6ea0622f5-kube-api-access-5dlcq\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.223951 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-catalog-content\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.223975 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-utilities\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.231772 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92fv" event={"ID":"74f91f8e-c973-4ffb-89d2-8b0683578a84","Type":"ContainerStarted","Data":"515ae54dd123191bd17aedb9c91857fa3cfdbdc2dc4f34907d026cbe14e533aa"} Dec 12 
00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.234044 4606 generic.go:334] "Generic (PLEG): container finished" podID="bf73008f-0c71-4676-ae7a-8a3256c3df05" containerID="cf14ad7ab31b14c007f5cbdd6e19bd3b7e90074f8dff217b5669a4f41beca0f0" exitCode=0 Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.234087 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6rsl" event={"ID":"bf73008f-0c71-4676-ae7a-8a3256c3df05","Type":"ContainerDied","Data":"cf14ad7ab31b14c007f5cbdd6e19bd3b7e90074f8dff217b5669a4f41beca0f0"} Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.280141 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.324923 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-catalog-content\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.324982 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-utilities\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.325089 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dlcq\" (UniqueName: \"kubernetes.io/projected/9df9dd2c-220d-43e3-a680-21d6ea0622f5-kube-api-access-5dlcq\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.325824 
4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-catalog-content\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.326100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-utilities\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.340804 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dlcq\" (UniqueName: \"kubernetes.io/projected/9df9dd2c-220d-43e3-a680-21d6ea0622f5-kube-api-access-5dlcq\") pod \"redhat-operators-l7nww\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.461777 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvcv5"] Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.502210 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:26 crc kubenswrapper[4606]: W1212 00:38:26.518652 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0375875_a842_430b_8e0f_996d03beb60f.slice/crio-b50f935deefb907c48bf79973e35048907e8409d2c0e4a6aaf355c12df82a108 WatchSource:0}: Error finding container b50f935deefb907c48bf79973e35048907e8409d2c0e4a6aaf355c12df82a108: Status 404 returned error can't find the container with id b50f935deefb907c48bf79973e35048907e8409d2c0e4a6aaf355c12df82a108 Dec 12 00:38:26 crc kubenswrapper[4606]: I1212 00:38:26.911661 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7nww"] Dec 12 00:38:26 crc kubenswrapper[4606]: W1212 00:38:26.934541 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9df9dd2c_220d_43e3_a680_21d6ea0622f5.slice/crio-d123b28c61aa20b82ec30938185d60eca045d9aaf68efa36bab971be9e6fe687 WatchSource:0}: Error finding container d123b28c61aa20b82ec30938185d60eca045d9aaf68efa36bab971be9e6fe687: Status 404 returned error can't find the container with id d123b28c61aa20b82ec30938185d60eca045d9aaf68efa36bab971be9e6fe687 Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.242451 4606 generic.go:334] "Generic (PLEG): container finished" podID="0986169d-1add-4351-8132-d483cfcf36ef" containerID="1d90d26226ce187e322fd66c82cc6de3499a5f2004bec15f3fed2dbb0864cff5" exitCode=0 Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.242530 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dmf2" event={"ID":"0986169d-1add-4351-8132-d483cfcf36ef","Type":"ContainerDied","Data":"1d90d26226ce187e322fd66c82cc6de3499a5f2004bec15f3fed2dbb0864cff5"} Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.248274 4606 generic.go:334] "Generic 
(PLEG): container finished" podID="74f91f8e-c973-4ffb-89d2-8b0683578a84" containerID="73294c2d9009a4c0bf5594ba3bec2d7d0e25d2b81534924437d0e531ba653219" exitCode=0 Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.248382 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92fv" event={"ID":"74f91f8e-c973-4ffb-89d2-8b0683578a84","Type":"ContainerDied","Data":"73294c2d9009a4c0bf5594ba3bec2d7d0e25d2b81534924437d0e531ba653219"} Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.256472 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6rsl" event={"ID":"bf73008f-0c71-4676-ae7a-8a3256c3df05","Type":"ContainerStarted","Data":"961229365c0316141103fbdffd14faad4842e8a53a8118f1b1c6b3c5e7e6b921"} Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.263061 4606 generic.go:334] "Generic (PLEG): container finished" podID="c0375875-a842-430b-8e0f-996d03beb60f" containerID="dc76dd226260278a8a33f3fb9327434257fc81adff3f0abca9a3539d86b57709" exitCode=0 Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.263128 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvcv5" event={"ID":"c0375875-a842-430b-8e0f-996d03beb60f","Type":"ContainerDied","Data":"dc76dd226260278a8a33f3fb9327434257fc81adff3f0abca9a3539d86b57709"} Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.263154 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvcv5" event={"ID":"c0375875-a842-430b-8e0f-996d03beb60f","Type":"ContainerStarted","Data":"b50f935deefb907c48bf79973e35048907e8409d2c0e4a6aaf355c12df82a108"} Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.265318 4606 generic.go:334] "Generic (PLEG): container finished" podID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerID="2bbd03411a81b51047ee545e4c28b2dc3078fe5f50a1f2f0de6e8a24d1141953" exitCode=0 Dec 12 00:38:27 crc 
kubenswrapper[4606]: I1212 00:38:27.265347 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7nww" event={"ID":"9df9dd2c-220d-43e3-a680-21d6ea0622f5","Type":"ContainerDied","Data":"2bbd03411a81b51047ee545e4c28b2dc3078fe5f50a1f2f0de6e8a24d1141953"} Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.265364 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7nww" event={"ID":"9df9dd2c-220d-43e3-a680-21d6ea0622f5","Type":"ContainerStarted","Data":"d123b28c61aa20b82ec30938185d60eca045d9aaf68efa36bab971be9e6fe687"} Dec 12 00:38:27 crc kubenswrapper[4606]: I1212 00:38:27.286554 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x6rsl" podStartSLOduration=1.7579929380000001 podStartE2EDuration="4.286538809s" podCreationTimestamp="2025-12-12 00:38:23 +0000 UTC" firstStartedPulling="2025-12-12 00:38:24.208612866 +0000 UTC m=+894.753965732" lastFinishedPulling="2025-12-12 00:38:26.737158737 +0000 UTC m=+897.282511603" observedRunningTime="2025-12-12 00:38:27.278302176 +0000 UTC m=+897.823655052" watchObservedRunningTime="2025-12-12 00:38:27.286538809 +0000 UTC m=+897.831891675" Dec 12 00:38:28 crc kubenswrapper[4606]: I1212 00:38:28.275059 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dmf2" event={"ID":"0986169d-1add-4351-8132-d483cfcf36ef","Type":"ContainerStarted","Data":"e84581583a7d920e9b773b1c2c6719d46486f3150c777e8f83de3bb9e01e93bb"} Dec 12 00:38:28 crc kubenswrapper[4606]: I1212 00:38:28.281206 4606 generic.go:334] "Generic (PLEG): container finished" podID="c0375875-a842-430b-8e0f-996d03beb60f" containerID="a10103f04d5cfd0c97e343b1d6a4398c7f7824da160102f47c2b2ade794eb689" exitCode=0 Dec 12 00:38:28 crc kubenswrapper[4606]: I1212 00:38:28.281249 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xvcv5" event={"ID":"c0375875-a842-430b-8e0f-996d03beb60f","Type":"ContainerDied","Data":"a10103f04d5cfd0c97e343b1d6a4398c7f7824da160102f47c2b2ade794eb689"} Dec 12 00:38:28 crc kubenswrapper[4606]: I1212 00:38:28.283666 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7nww" event={"ID":"9df9dd2c-220d-43e3-a680-21d6ea0622f5","Type":"ContainerStarted","Data":"8ad4a502008e1a5347672b41510651b81a542219b2c140d9042033868a21b0b6"} Dec 12 00:38:28 crc kubenswrapper[4606]: I1212 00:38:28.320031 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6dmf2" podStartSLOduration=2.808187661 podStartE2EDuration="5.320010948s" podCreationTimestamp="2025-12-12 00:38:23 +0000 UTC" firstStartedPulling="2025-12-12 00:38:25.223673189 +0000 UTC m=+895.769026055" lastFinishedPulling="2025-12-12 00:38:27.735496456 +0000 UTC m=+898.280849342" observedRunningTime="2025-12-12 00:38:28.292462783 +0000 UTC m=+898.837815649" watchObservedRunningTime="2025-12-12 00:38:28.320010948 +0000 UTC m=+898.865363814" Dec 12 00:38:29 crc kubenswrapper[4606]: I1212 00:38:29.291271 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvcv5" event={"ID":"c0375875-a842-430b-8e0f-996d03beb60f","Type":"ContainerStarted","Data":"98d121b4828258356c5cd1be509b25d6b4065cb37869bd736a8755fa0fcbdd43"} Dec 12 00:38:29 crc kubenswrapper[4606]: I1212 00:38:29.295525 4606 generic.go:334] "Generic (PLEG): container finished" podID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerID="8ad4a502008e1a5347672b41510651b81a542219b2c140d9042033868a21b0b6" exitCode=0 Dec 12 00:38:29 crc kubenswrapper[4606]: I1212 00:38:29.295717 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7nww" 
event={"ID":"9df9dd2c-220d-43e3-a680-21d6ea0622f5","Type":"ContainerDied","Data":"8ad4a502008e1a5347672b41510651b81a542219b2c140d9042033868a21b0b6"} Dec 12 00:38:29 crc kubenswrapper[4606]: I1212 00:38:29.300967 4606 generic.go:334] "Generic (PLEG): container finished" podID="74f91f8e-c973-4ffb-89d2-8b0683578a84" containerID="76239c335605aeb83e706f914ac6ab96f3ea1394b6b6d40122d2a35de3fd0998" exitCode=0 Dec 12 00:38:29 crc kubenswrapper[4606]: I1212 00:38:29.300997 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92fv" event={"ID":"74f91f8e-c973-4ffb-89d2-8b0683578a84","Type":"ContainerDied","Data":"76239c335605aeb83e706f914ac6ab96f3ea1394b6b6d40122d2a35de3fd0998"} Dec 12 00:38:29 crc kubenswrapper[4606]: I1212 00:38:29.316230 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xvcv5" podStartSLOduration=2.782011264 podStartE2EDuration="4.316143449s" podCreationTimestamp="2025-12-12 00:38:25 +0000 UTC" firstStartedPulling="2025-12-12 00:38:27.264784331 +0000 UTC m=+897.810137197" lastFinishedPulling="2025-12-12 00:38:28.798916516 +0000 UTC m=+899.344269382" observedRunningTime="2025-12-12 00:38:29.311286888 +0000 UTC m=+899.856639764" watchObservedRunningTime="2025-12-12 00:38:29.316143449 +0000 UTC m=+899.861496305" Dec 12 00:38:30 crc kubenswrapper[4606]: I1212 00:38:30.063292 4606 scope.go:117] "RemoveContainer" containerID="d3601dd0b8c0a574cc4689906f3c4ed491f3432cae8b328c55f07e5d17b6e976" Dec 12 00:38:32 crc kubenswrapper[4606]: I1212 00:38:32.010008 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:38:32 crc kubenswrapper[4606]: I1212 00:38:32.010415 4606 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:38:32 crc kubenswrapper[4606]: I1212 00:38:32.318688 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v92fv" event={"ID":"74f91f8e-c973-4ffb-89d2-8b0683578a84","Type":"ContainerStarted","Data":"fa9dec15cf75ab0dfd29e1e3e801de362f5cf4f436bb019d0803acf547c27fc5"} Dec 12 00:38:32 crc kubenswrapper[4606]: I1212 00:38:32.321102 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzcfk_b1b4ec5b-c88b-4808-8bab-8b5f9f4f88b0/kube-multus/2.log" Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.328204 4606 generic.go:334] "Generic (PLEG): container finished" podID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerID="c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79" exitCode=0 Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.328255 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bmzj" event={"ID":"f8f2888b-ed28-4fb8-b268-b7190b535644","Type":"ContainerDied","Data":"c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79"} Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.330842 4606 generic.go:334] "Generic (PLEG): container finished" podID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerID="c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3" exitCode=0 Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.330896 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6vsb" event={"ID":"c717d9dc-08d4-4863-8788-0f151d9f6c21","Type":"ContainerDied","Data":"c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3"} Dec 12 
00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.333050 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7nww" event={"ID":"9df9dd2c-220d-43e3-a680-21d6ea0622f5","Type":"ContainerStarted","Data":"539312d1a8084692cea75168d401825c852b0de6ef1f83e3b9f8e40100f154b6"} Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.369230 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v92fv" podStartSLOduration=3.885870917 podStartE2EDuration="8.369206293s" podCreationTimestamp="2025-12-12 00:38:25 +0000 UTC" firstStartedPulling="2025-12-12 00:38:27.251324467 +0000 UTC m=+897.796677333" lastFinishedPulling="2025-12-12 00:38:31.734659843 +0000 UTC m=+902.280012709" observedRunningTime="2025-12-12 00:38:33.365326569 +0000 UTC m=+903.910679435" watchObservedRunningTime="2025-12-12 00:38:33.369206293 +0000 UTC m=+903.914559169" Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.409529 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l7nww" podStartSLOduration=2.980219538 podStartE2EDuration="7.409508853s" podCreationTimestamp="2025-12-12 00:38:26 +0000 UTC" firstStartedPulling="2025-12-12 00:38:27.273554188 +0000 UTC m=+897.818907054" lastFinishedPulling="2025-12-12 00:38:31.702843493 +0000 UTC m=+902.248196369" observedRunningTime="2025-12-12 00:38:33.406456341 +0000 UTC m=+903.951809207" watchObservedRunningTime="2025-12-12 00:38:33.409508853 +0000 UTC m=+903.954861729" Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.669592 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 00:38:33.669656 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:33 crc kubenswrapper[4606]: I1212 
00:38:33.733601 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:34 crc kubenswrapper[4606]: I1212 00:38:34.084018 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:34 crc kubenswrapper[4606]: I1212 00:38:34.084136 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:34 crc kubenswrapper[4606]: I1212 00:38:34.132128 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:34 crc kubenswrapper[4606]: I1212 00:38:34.394979 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x6rsl" Dec 12 00:38:34 crc kubenswrapper[4606]: I1212 00:38:34.403556 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:35 crc kubenswrapper[4606]: I1212 00:38:35.347772 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bmzj" event={"ID":"f8f2888b-ed28-4fb8-b268-b7190b535644","Type":"ContainerStarted","Data":"9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874"} Dec 12 00:38:35 crc kubenswrapper[4606]: I1212 00:38:35.351742 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6vsb" event={"ID":"c717d9dc-08d4-4863-8788-0f151d9f6c21","Type":"ContainerStarted","Data":"fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3"} Dec 12 00:38:35 crc kubenswrapper[4606]: I1212 00:38:35.391092 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6bmzj" podStartSLOduration=2.298977833 
podStartE2EDuration="12.391071633s" podCreationTimestamp="2025-12-12 00:38:23 +0000 UTC" firstStartedPulling="2025-12-12 00:38:24.22058521 +0000 UTC m=+894.765938086" lastFinishedPulling="2025-12-12 00:38:34.31267902 +0000 UTC m=+904.858031886" observedRunningTime="2025-12-12 00:38:35.371739791 +0000 UTC m=+905.917092667" watchObservedRunningTime="2025-12-12 00:38:35.391071633 +0000 UTC m=+905.936424499" Dec 12 00:38:35 crc kubenswrapper[4606]: I1212 00:38:35.391283 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c6vsb" podStartSLOduration=3.106939262 podStartE2EDuration="13.391277549s" podCreationTimestamp="2025-12-12 00:38:22 +0000 UTC" firstStartedPulling="2025-12-12 00:38:24.184268458 +0000 UTC m=+894.729621364" lastFinishedPulling="2025-12-12 00:38:34.468606785 +0000 UTC m=+905.013959651" observedRunningTime="2025-12-12 00:38:35.389963553 +0000 UTC m=+905.935316419" watchObservedRunningTime="2025-12-12 00:38:35.391277549 +0000 UTC m=+905.936630415" Dec 12 00:38:35 crc kubenswrapper[4606]: I1212 00:38:35.873751 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:35 crc kubenswrapper[4606]: I1212 00:38:35.874841 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:35 crc kubenswrapper[4606]: I1212 00:38:35.924561 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:36 crc kubenswrapper[4606]: I1212 00:38:36.281111 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:36 crc kubenswrapper[4606]: I1212 00:38:36.281251 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:36 
crc kubenswrapper[4606]: I1212 00:38:36.322646 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:36 crc kubenswrapper[4606]: I1212 00:38:36.394749 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:36 crc kubenswrapper[4606]: I1212 00:38:36.503092 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:36 crc kubenswrapper[4606]: I1212 00:38:36.503321 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:37 crc kubenswrapper[4606]: I1212 00:38:37.539630 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l7nww" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="registry-server" probeResult="failure" output=< Dec 12 00:38:37 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:38:37 crc kubenswrapper[4606]: > Dec 12 00:38:37 crc kubenswrapper[4606]: I1212 00:38:37.742447 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6dmf2"] Dec 12 00:38:37 crc kubenswrapper[4606]: I1212 00:38:37.742804 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6dmf2" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="registry-server" containerID="cri-o://e84581583a7d920e9b773b1c2c6719d46486f3150c777e8f83de3bb9e01e93bb" gracePeriod=2 Dec 12 00:38:40 crc kubenswrapper[4606]: I1212 00:38:40.137457 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvcv5"] Dec 12 00:38:40 crc kubenswrapper[4606]: I1212 00:38:40.137748 4606 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-marketplace-xvcv5" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="registry-server" containerID="cri-o://98d121b4828258356c5cd1be509b25d6b4065cb37869bd736a8755fa0fcbdd43" gracePeriod=2 Dec 12 00:38:40 crc kubenswrapper[4606]: I1212 00:38:40.378492 4606 generic.go:334] "Generic (PLEG): container finished" podID="0986169d-1add-4351-8132-d483cfcf36ef" containerID="e84581583a7d920e9b773b1c2c6719d46486f3150c777e8f83de3bb9e01e93bb" exitCode=0 Dec 12 00:38:40 crc kubenswrapper[4606]: I1212 00:38:40.378718 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dmf2" event={"ID":"0986169d-1add-4351-8132-d483cfcf36ef","Type":"ContainerDied","Data":"e84581583a7d920e9b773b1c2c6719d46486f3150c777e8f83de3bb9e01e93bb"} Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.305975 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.395183 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6dmf2" event={"ID":"0986169d-1add-4351-8132-d483cfcf36ef","Type":"ContainerDied","Data":"98298e33d240d4d89285e34f8f0340017806b7ca661afd94d9f00370b253823e"} Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.395240 4606 scope.go:117] "RemoveContainer" containerID="e84581583a7d920e9b773b1c2c6719d46486f3150c777e8f83de3bb9e01e93bb" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.395209 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6dmf2" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.399438 4606 generic.go:334] "Generic (PLEG): container finished" podID="c0375875-a842-430b-8e0f-996d03beb60f" containerID="98d121b4828258356c5cd1be509b25d6b4065cb37869bd736a8755fa0fcbdd43" exitCode=0 Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.399462 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvcv5" event={"ID":"c0375875-a842-430b-8e0f-996d03beb60f","Type":"ContainerDied","Data":"98d121b4828258356c5cd1be509b25d6b4065cb37869bd736a8755fa0fcbdd43"} Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.418674 4606 scope.go:117] "RemoveContainer" containerID="1d90d26226ce187e322fd66c82cc6de3499a5f2004bec15f3fed2dbb0864cff5" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.438231 4606 scope.go:117] "RemoveContainer" containerID="086cb42c1dcc27a71dae13fb063221a2ccd981e6aff622701d7d5729a731c485" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.446836 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-catalog-content\") pod \"0986169d-1add-4351-8132-d483cfcf36ef\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.446918 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-utilities\") pod \"0986169d-1add-4351-8132-d483cfcf36ef\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.446966 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7cfr\" (UniqueName: \"kubernetes.io/projected/0986169d-1add-4351-8132-d483cfcf36ef-kube-api-access-p7cfr\") pod 
\"0986169d-1add-4351-8132-d483cfcf36ef\" (UID: \"0986169d-1add-4351-8132-d483cfcf36ef\") " Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.448068 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-utilities" (OuterVolumeSpecName: "utilities") pod "0986169d-1add-4351-8132-d483cfcf36ef" (UID: "0986169d-1add-4351-8132-d483cfcf36ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.457013 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0986169d-1add-4351-8132-d483cfcf36ef-kube-api-access-p7cfr" (OuterVolumeSpecName: "kube-api-access-p7cfr") pod "0986169d-1add-4351-8132-d483cfcf36ef" (UID: "0986169d-1add-4351-8132-d483cfcf36ef"). InnerVolumeSpecName "kube-api-access-p7cfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.497618 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0986169d-1add-4351-8132-d483cfcf36ef" (UID: "0986169d-1add-4351-8132-d483cfcf36ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.548029 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.548081 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0986169d-1add-4351-8132-d483cfcf36ef-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.548103 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7cfr\" (UniqueName: \"kubernetes.io/projected/0986169d-1add-4351-8132-d483cfcf36ef-kube-api-access-p7cfr\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.737686 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6dmf2"] Dec 12 00:38:42 crc kubenswrapper[4606]: I1212 00:38:42.745247 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6dmf2"] Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.087474 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.087527 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.163589 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.208133 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.356771 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft4p5\" (UniqueName: \"kubernetes.io/projected/c0375875-a842-430b-8e0f-996d03beb60f-kube-api-access-ft4p5\") pod \"c0375875-a842-430b-8e0f-996d03beb60f\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.356871 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-utilities\") pod \"c0375875-a842-430b-8e0f-996d03beb60f\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.356952 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-catalog-content\") pod \"c0375875-a842-430b-8e0f-996d03beb60f\" (UID: \"c0375875-a842-430b-8e0f-996d03beb60f\") " Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.357707 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-utilities" (OuterVolumeSpecName: "utilities") pod "c0375875-a842-430b-8e0f-996d03beb60f" (UID: "c0375875-a842-430b-8e0f-996d03beb60f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.359969 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0375875-a842-430b-8e0f-996d03beb60f-kube-api-access-ft4p5" (OuterVolumeSpecName: "kube-api-access-ft4p5") pod "c0375875-a842-430b-8e0f-996d03beb60f" (UID: "c0375875-a842-430b-8e0f-996d03beb60f"). InnerVolumeSpecName "kube-api-access-ft4p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.379653 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0375875-a842-430b-8e0f-996d03beb60f" (UID: "c0375875-a842-430b-8e0f-996d03beb60f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.405874 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvcv5" event={"ID":"c0375875-a842-430b-8e0f-996d03beb60f","Type":"ContainerDied","Data":"b50f935deefb907c48bf79973e35048907e8409d2c0e4a6aaf355c12df82a108"} Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.405891 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvcv5" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.405918 4606 scope.go:117] "RemoveContainer" containerID="98d121b4828258356c5cd1be509b25d6b4065cb37869bd736a8755fa0fcbdd43" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.422370 4606 scope.go:117] "RemoveContainer" containerID="a10103f04d5cfd0c97e343b1d6a4398c7f7824da160102f47c2b2ade794eb689" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.435782 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvcv5"] Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.438794 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvcv5"] Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.446467 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.458316 4606 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft4p5\" (UniqueName: \"kubernetes.io/projected/c0375875-a842-430b-8e0f-996d03beb60f-kube-api-access-ft4p5\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.458360 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.458373 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0375875-a842-430b-8e0f-996d03beb60f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.461793 4606 scope.go:117] "RemoveContainer" containerID="dc76dd226260278a8a33f3fb9327434257fc81adff3f0abca9a3539d86b57709" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.497582 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.497736 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.532847 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.708995 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0986169d-1add-4351-8132-d483cfcf36ef" path="/var/lib/kubelet/pods/0986169d-1add-4351-8132-d483cfcf36ef/volumes" Dec 12 00:38:43 crc kubenswrapper[4606]: I1212 00:38:43.710319 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0375875-a842-430b-8e0f-996d03beb60f" path="/var/lib/kubelet/pods/c0375875-a842-430b-8e0f-996d03beb60f/volumes" Dec 12 
00:38:44 crc kubenswrapper[4606]: I1212 00:38:44.475700 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:45 crc kubenswrapper[4606]: I1212 00:38:45.536698 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bmzj"] Dec 12 00:38:45 crc kubenswrapper[4606]: I1212 00:38:45.911698 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v92fv" Dec 12 00:38:46 crc kubenswrapper[4606]: I1212 00:38:46.557950 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:46 crc kubenswrapper[4606]: I1212 00:38:46.614436 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.436386 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6bmzj" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="registry-server" containerID="cri-o://9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874" gracePeriod=2 Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.780036 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.826302 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-utilities\") pod \"f8f2888b-ed28-4fb8-b268-b7190b535644\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.826437 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhr6p\" (UniqueName: \"kubernetes.io/projected/f8f2888b-ed28-4fb8-b268-b7190b535644-kube-api-access-hhr6p\") pod \"f8f2888b-ed28-4fb8-b268-b7190b535644\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.826472 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-catalog-content\") pod \"f8f2888b-ed28-4fb8-b268-b7190b535644\" (UID: \"f8f2888b-ed28-4fb8-b268-b7190b535644\") " Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.827319 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-utilities" (OuterVolumeSpecName: "utilities") pod "f8f2888b-ed28-4fb8-b268-b7190b535644" (UID: "f8f2888b-ed28-4fb8-b268-b7190b535644"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.832047 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f2888b-ed28-4fb8-b268-b7190b535644-kube-api-access-hhr6p" (OuterVolumeSpecName: "kube-api-access-hhr6p") pod "f8f2888b-ed28-4fb8-b268-b7190b535644" (UID: "f8f2888b-ed28-4fb8-b268-b7190b535644"). InnerVolumeSpecName "kube-api-access-hhr6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.876266 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8f2888b-ed28-4fb8-b268-b7190b535644" (UID: "f8f2888b-ed28-4fb8-b268-b7190b535644"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.928316 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhr6p\" (UniqueName: \"kubernetes.io/projected/f8f2888b-ed28-4fb8-b268-b7190b535644-kube-api-access-hhr6p\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.928375 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:47 crc kubenswrapper[4606]: I1212 00:38:47.928388 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f2888b-ed28-4fb8-b268-b7190b535644-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.442276 4606 generic.go:334] "Generic (PLEG): container finished" podID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerID="9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874" exitCode=0 Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.442317 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bmzj" event={"ID":"f8f2888b-ed28-4fb8-b268-b7190b535644","Type":"ContainerDied","Data":"9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874"} Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.442341 4606 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-6bmzj" event={"ID":"f8f2888b-ed28-4fb8-b268-b7190b535644","Type":"ContainerDied","Data":"e005765e60e24df6c3391b9b94d7d0782018b5babfcbd7ca790b2e8e13148179"} Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.442358 4606 scope.go:117] "RemoveContainer" containerID="9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.442459 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bmzj" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.465048 4606 scope.go:117] "RemoveContainer" containerID="c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.487531 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bmzj"] Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.490729 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6bmzj"] Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.490944 4606 scope.go:117] "RemoveContainer" containerID="223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.502663 4606 scope.go:117] "RemoveContainer" containerID="9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874" Dec 12 00:38:48 crc kubenswrapper[4606]: E1212 00:38:48.503066 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874\": container with ID starting with 9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874 not found: ID does not exist" containerID="9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 
00:38:48.503116 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874"} err="failed to get container status \"9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874\": rpc error: code = NotFound desc = could not find container \"9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874\": container with ID starting with 9958aaba90c7611348b606c9b5b0c631e979e262ea56e9b5853f75bcd7461874 not found: ID does not exist" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.503146 4606 scope.go:117] "RemoveContainer" containerID="c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79" Dec 12 00:38:48 crc kubenswrapper[4606]: E1212 00:38:48.503702 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79\": container with ID starting with c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79 not found: ID does not exist" containerID="c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.503727 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79"} err="failed to get container status \"c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79\": rpc error: code = NotFound desc = could not find container \"c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79\": container with ID starting with c8fb19854e34b035d9791ac8e297afbea4566855f415686d33705993b7198f79 not found: ID does not exist" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.503743 4606 scope.go:117] "RemoveContainer" containerID="223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016" Dec 12 00:38:48 crc 
kubenswrapper[4606]: E1212 00:38:48.504024 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016\": container with ID starting with 223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016 not found: ID does not exist" containerID="223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016" Dec 12 00:38:48 crc kubenswrapper[4606]: I1212 00:38:48.504056 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016"} err="failed to get container status \"223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016\": rpc error: code = NotFound desc = could not find container \"223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016\": container with ID starting with 223ce7498fbb3267571aa6c5d9e6e27811879a720016d5ee3e9172652cb85016 not found: ID does not exist" Dec 12 00:38:49 crc kubenswrapper[4606]: I1212 00:38:49.719310 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" path="/var/lib/kubelet/pods/f8f2888b-ed28-4fb8-b268-b7190b535644/volumes" Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.010231 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.010735 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.010777 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.011290 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d38d273cf66284763f6d7e3888567975579d39fc9a7372cc2cae90d2dfe8ce04"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.011335 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://d38d273cf66284763f6d7e3888567975579d39fc9a7372cc2cae90d2dfe8ce04" gracePeriod=600 Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.517518 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="d38d273cf66284763f6d7e3888567975579d39fc9a7372cc2cae90d2dfe8ce04" exitCode=0 Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.517725 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"d38d273cf66284763f6d7e3888567975579d39fc9a7372cc2cae90d2dfe8ce04"} Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.517749 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" 
event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"98193dc190ed04d9478e682edb5e4363e657a585ee1347d2eb910b80fed16f3f"} Dec 12 00:39:02 crc kubenswrapper[4606]: I1212 00:39:02.517766 4606 scope.go:117] "RemoveContainer" containerID="1873e8515b38b39e992466285ce6933f345b21f7fe695ca304e250f3437cff70" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221048 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m"] Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221354 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="extract-content" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221376 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="extract-content" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221395 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="extract-utilities" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221406 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="extract-utilities" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221425 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221437 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221450 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="extract-utilities" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 
00:39:03.221461 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="extract-utilities" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221488 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221498 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221511 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="extract-utilities" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221547 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="extract-utilities" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221562 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="extract-content" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221572 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="extract-content" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221588 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221599 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: E1212 00:39:03.221613 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="extract-content" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 
00:39:03.221623 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="extract-content" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221775 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f2888b-ed28-4fb8-b268-b7190b535644" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221795 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0986169d-1add-4351-8132-d483cfcf36ef" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.221815 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0375875-a842-430b-8e0f-996d03beb60f" containerName="registry-server" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.222931 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.226610 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.242361 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m"] Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.344761 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.344855 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5x9hq\" (UniqueName: \"kubernetes.io/projected/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-kube-api-access-5x9hq\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.344891 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.445711 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.445850 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x9hq\" (UniqueName: \"kubernetes.io/projected/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-kube-api-access-5x9hq\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.445921 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.446892 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.446895 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.472755 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x9hq\" (UniqueName: \"kubernetes.io/projected/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-kube-api-access-5x9hq\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.552955 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:03 crc kubenswrapper[4606]: I1212 00:39:03.831751 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m"] Dec 12 00:39:03 crc kubenswrapper[4606]: W1212 00:39:03.835877 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac2009b_c1b1_4c9a_8d8b_045e0c3b4545.slice/crio-37ed7216fc8ee02f46a97b835ef9ff822e4dc4a5e0b8aaea30b99b233adc4341 WatchSource:0}: Error finding container 37ed7216fc8ee02f46a97b835ef9ff822e4dc4a5e0b8aaea30b99b233adc4341: Status 404 returned error can't find the container with id 37ed7216fc8ee02f46a97b835ef9ff822e4dc4a5e0b8aaea30b99b233adc4341 Dec 12 00:39:04 crc kubenswrapper[4606]: I1212 00:39:04.530644 4606 generic.go:334] "Generic (PLEG): container finished" podID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerID="62a891a83f4b6a1b95454e46299c4d923fb19af58f1e0fc69713a215de9143b1" exitCode=0 Dec 12 00:39:04 crc kubenswrapper[4606]: I1212 00:39:04.530830 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" event={"ID":"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545","Type":"ContainerDied","Data":"62a891a83f4b6a1b95454e46299c4d923fb19af58f1e0fc69713a215de9143b1"} Dec 12 00:39:04 crc kubenswrapper[4606]: I1212 00:39:04.530918 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" event={"ID":"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545","Type":"ContainerStarted","Data":"37ed7216fc8ee02f46a97b835ef9ff822e4dc4a5e0b8aaea30b99b233adc4341"} Dec 12 00:39:06 crc kubenswrapper[4606]: I1212 00:39:06.545781 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerID="ebb57c695527bf4bd537e97a2ed24b5d6bf57de7a74024a2e140ad7caa973536" exitCode=0 Dec 12 00:39:06 crc kubenswrapper[4606]: I1212 00:39:06.546132 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" event={"ID":"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545","Type":"ContainerDied","Data":"ebb57c695527bf4bd537e97a2ed24b5d6bf57de7a74024a2e140ad7caa973536"} Dec 12 00:39:07 crc kubenswrapper[4606]: I1212 00:39:07.556438 4606 generic.go:334] "Generic (PLEG): container finished" podID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerID="4ca0e020a9fb06b6ef42d97d664ad2328aa4406eecfe4e5fbe76218e9fb4bc21" exitCode=0 Dec 12 00:39:07 crc kubenswrapper[4606]: I1212 00:39:07.556529 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" event={"ID":"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545","Type":"ContainerDied","Data":"4ca0e020a9fb06b6ef42d97d664ad2328aa4406eecfe4e5fbe76218e9fb4bc21"} Dec 12 00:39:08 crc kubenswrapper[4606]: I1212 00:39:08.867159 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.011741 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-bundle\") pod \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.011861 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-util\") pod \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.011903 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x9hq\" (UniqueName: \"kubernetes.io/projected/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-kube-api-access-5x9hq\") pod \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\" (UID: \"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545\") " Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.014285 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-bundle" (OuterVolumeSpecName: "bundle") pod "dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" (UID: "dac2009b-c1b1-4c9a-8d8b-045e0c3b4545"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.020745 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-kube-api-access-5x9hq" (OuterVolumeSpecName: "kube-api-access-5x9hq") pod "dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" (UID: "dac2009b-c1b1-4c9a-8d8b-045e0c3b4545"). InnerVolumeSpecName "kube-api-access-5x9hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.047730 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-util" (OuterVolumeSpecName: "util") pod "dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" (UID: "dac2009b-c1b1-4c9a-8d8b-045e0c3b4545"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.114003 4606 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.114036 4606 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-util\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.114049 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x9hq\" (UniqueName: \"kubernetes.io/projected/dac2009b-c1b1-4c9a-8d8b-045e0c3b4545-kube-api-access-5x9hq\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.579101 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" event={"ID":"dac2009b-c1b1-4c9a-8d8b-045e0c3b4545","Type":"ContainerDied","Data":"37ed7216fc8ee02f46a97b835ef9ff822e4dc4a5e0b8aaea30b99b233adc4341"} Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.579168 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37ed7216fc8ee02f46a97b835ef9ff822e4dc4a5e0b8aaea30b99b233adc4341" Dec 12 00:39:09 crc kubenswrapper[4606]: I1212 00:39:09.579217 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.454267 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-9fhgf"] Dec 12 00:39:12 crc kubenswrapper[4606]: E1212 00:39:12.454825 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerName="util" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.454839 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerName="util" Dec 12 00:39:12 crc kubenswrapper[4606]: E1212 00:39:12.454850 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerName="pull" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.454857 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerName="pull" Dec 12 00:39:12 crc kubenswrapper[4606]: E1212 00:39:12.454869 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerName="extract" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.454877 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerName="extract" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.454993 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac2009b-c1b1-4c9a-8d8b-045e0c3b4545" containerName="extract" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.455419 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.462155 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.462236 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.463681 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-sw2ml" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.487783 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-9fhgf"] Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.560242 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckq45\" (UniqueName: \"kubernetes.io/projected/afff02ee-90b6-4315-ae47-8c8585994b6d-kube-api-access-ckq45\") pod \"nmstate-operator-6769fb99d-9fhgf\" (UID: \"afff02ee-90b6-4315-ae47-8c8585994b6d\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.661781 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckq45\" (UniqueName: \"kubernetes.io/projected/afff02ee-90b6-4315-ae47-8c8585994b6d-kube-api-access-ckq45\") pod \"nmstate-operator-6769fb99d-9fhgf\" (UID: \"afff02ee-90b6-4315-ae47-8c8585994b6d\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.697276 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckq45\" (UniqueName: \"kubernetes.io/projected/afff02ee-90b6-4315-ae47-8c8585994b6d-kube-api-access-ckq45\") pod \"nmstate-operator-6769fb99d-9fhgf\" (UID: 
\"afff02ee-90b6-4315-ae47-8c8585994b6d\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.770698 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" Dec 12 00:39:12 crc kubenswrapper[4606]: I1212 00:39:12.969954 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-9fhgf"] Dec 12 00:39:13 crc kubenswrapper[4606]: I1212 00:39:13.597613 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" event={"ID":"afff02ee-90b6-4315-ae47-8c8585994b6d","Type":"ContainerStarted","Data":"518546a9bf6b76452d83287f056f8bd2364308babb6dd63e17154264b514cdcd"} Dec 12 00:39:15 crc kubenswrapper[4606]: I1212 00:39:15.608305 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" event={"ID":"afff02ee-90b6-4315-ae47-8c8585994b6d","Type":"ContainerStarted","Data":"2927a56d5eeb9137f93d6a53488e1641e51f9c053f2ccb1819db332c87495945"} Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.672060 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-9fhgf" podStartSLOduration=2.3852863380000002 podStartE2EDuration="4.672041429s" podCreationTimestamp="2025-12-12 00:39:12 +0000 UTC" firstStartedPulling="2025-12-12 00:39:12.980714335 +0000 UTC m=+943.526067201" lastFinishedPulling="2025-12-12 00:39:15.267469426 +0000 UTC m=+945.812822292" observedRunningTime="2025-12-12 00:39:15.63043878 +0000 UTC m=+946.175791646" watchObservedRunningTime="2025-12-12 00:39:16.672041429 +0000 UTC m=+947.217394295" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.676166 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl"] Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 
00:39:16.676955 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.680784 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pjdsg" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.694217 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl"] Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.706050 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-xpnht"] Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.706696 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.708057 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.709247 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-m5nlb"] Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.709766 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.717415 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxhsv\" (UniqueName: \"kubernetes.io/projected/042d210a-3148-4706-8e99-798c7cab2239-kube-api-access-wxhsv\") pod \"nmstate-metrics-7f7f7578db-95nwl\" (UID: \"042d210a-3148-4706-8e99-798c7cab2239\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.767993 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-xpnht"] Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.818845 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-dbus-socket\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.819054 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbm8r\" (UniqueName: \"kubernetes.io/projected/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-kube-api-access-mbm8r\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.819142 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fccbfa74-64a7-4920-a145-abde992f617d-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-xpnht\" (UID: \"fccbfa74-64a7-4920-a145-abde992f617d\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.819329 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxhsv\" (UniqueName: \"kubernetes.io/projected/042d210a-3148-4706-8e99-798c7cab2239-kube-api-access-wxhsv\") pod \"nmstate-metrics-7f7f7578db-95nwl\" (UID: \"042d210a-3148-4706-8e99-798c7cab2239\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.819419 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-ovs-socket\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.819488 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-nmstate-lock\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.819656 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5vq7\" (UniqueName: \"kubernetes.io/projected/fccbfa74-64a7-4920-a145-abde992f617d-kube-api-access-h5vq7\") pod \"nmstate-webhook-f8fb84555-xpnht\" (UID: \"fccbfa74-64a7-4920-a145-abde992f617d\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.853094 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxhsv\" (UniqueName: \"kubernetes.io/projected/042d210a-3148-4706-8e99-798c7cab2239-kube-api-access-wxhsv\") pod \"nmstate-metrics-7f7f7578db-95nwl\" (UID: \"042d210a-3148-4706-8e99-798c7cab2239\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" Dec 12 
00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.875584 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9"] Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.876328 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.877874 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.879166 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.879426 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8zt46" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.895651 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9"] Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921363 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbm8r\" (UniqueName: \"kubernetes.io/projected/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-kube-api-access-mbm8r\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921419 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27901900-87d9-45a6-a5cb-1fcf505917ee-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921444 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fccbfa74-64a7-4920-a145-abde992f617d-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-xpnht\" (UID: \"fccbfa74-64a7-4920-a145-abde992f617d\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921474 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-ovs-socket\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921530 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-ovs-socket\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921591 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-nmstate-lock\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: E1212 00:39:16.921636 4606 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921673 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-nmstate-lock\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " 
pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.921684 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5vq7\" (UniqueName: \"kubernetes.io/projected/fccbfa74-64a7-4920-a145-abde992f617d-kube-api-access-h5vq7\") pod \"nmstate-webhook-f8fb84555-xpnht\" (UID: \"fccbfa74-64a7-4920-a145-abde992f617d\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:16 crc kubenswrapper[4606]: E1212 00:39:16.922132 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fccbfa74-64a7-4920-a145-abde992f617d-tls-key-pair podName:fccbfa74-64a7-4920-a145-abde992f617d nodeName:}" failed. No retries permitted until 2025-12-12 00:39:17.421692355 +0000 UTC m=+947.967045221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/fccbfa74-64a7-4920-a145-abde992f617d-tls-key-pair") pod "nmstate-webhook-f8fb84555-xpnht" (UID: "fccbfa74-64a7-4920-a145-abde992f617d") : secret "openshift-nmstate-webhook" not found Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.922223 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27901900-87d9-45a6-a5cb-1fcf505917ee-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.922265 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6qj\" (UniqueName: \"kubernetes.io/projected/27901900-87d9-45a6-a5cb-1fcf505917ee-kube-api-access-kn6qj\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" 
Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.922350 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-dbus-socket\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.922644 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-dbus-socket\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.955395 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5vq7\" (UniqueName: \"kubernetes.io/projected/fccbfa74-64a7-4920-a145-abde992f617d-kube-api-access-h5vq7\") pod \"nmstate-webhook-f8fb84555-xpnht\" (UID: \"fccbfa74-64a7-4920-a145-abde992f617d\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.965833 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbm8r\" (UniqueName: \"kubernetes.io/projected/c5c179d2-3e8f-4aa4-8b37-737c167dd42f-kube-api-access-mbm8r\") pod \"nmstate-handler-m5nlb\" (UID: \"c5c179d2-3e8f-4aa4-8b37-737c167dd42f\") " pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:16 crc kubenswrapper[4606]: I1212 00:39:16.988826 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.025134 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27901900-87d9-45a6-a5cb-1fcf505917ee-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.025274 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn6qj\" (UniqueName: \"kubernetes.io/projected/27901900-87d9-45a6-a5cb-1fcf505917ee-kube-api-access-kn6qj\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.026085 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27901900-87d9-45a6-a5cb-1fcf505917ee-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.026132 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27901900-87d9-45a6-a5cb-1fcf505917ee-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: E1212 00:39:17.026448 4606 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 12 00:39:17 crc kubenswrapper[4606]: E1212 00:39:17.026486 
4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27901900-87d9-45a6-a5cb-1fcf505917ee-plugin-serving-cert podName:27901900-87d9-45a6-a5cb-1fcf505917ee nodeName:}" failed. No retries permitted until 2025-12-12 00:39:17.526475813 +0000 UTC m=+948.071828679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/27901900-87d9-45a6-a5cb-1fcf505917ee-plugin-serving-cert") pod "nmstate-console-plugin-6ff7998486-82lk9" (UID: "27901900-87d9-45a6-a5cb-1fcf505917ee") : secret "plugin-serving-cert" not found Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.031486 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.066826 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6qj\" (UniqueName: \"kubernetes.io/projected/27901900-87d9-45a6-a5cb-1fcf505917ee-kube-api-access-kn6qj\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.100943 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-969d5cc7f-gr4h2"] Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.101855 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.113968 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-969d5cc7f-gr4h2"] Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.228292 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-console-config\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.228346 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk9xk\" (UniqueName: \"kubernetes.io/projected/f75cab47-0242-4639-ae86-5e48f5e5149d-kube-api-access-xk9xk\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.228379 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f75cab47-0242-4639-ae86-5e48f5e5149d-console-serving-cert\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.228426 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-oauth-serving-cert\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.228449 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-service-ca\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.228471 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f75cab47-0242-4639-ae86-5e48f5e5149d-console-oauth-config\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.228525 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-trusted-ca-bundle\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.283412 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl"] Dec 12 00:39:17 crc kubenswrapper[4606]: W1212 00:39:17.292423 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod042d210a_3148_4706_8e99_798c7cab2239.slice/crio-a38338de4b9261ebbfd25aae74e3a29b71ee5d075714f2811a7666885686765f WatchSource:0}: Error finding container a38338de4b9261ebbfd25aae74e3a29b71ee5d075714f2811a7666885686765f: Status 404 returned error can't find the container with id a38338de4b9261ebbfd25aae74e3a29b71ee5d075714f2811a7666885686765f Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.329731 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-oauth-serving-cert\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.329772 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-service-ca\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.329797 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f75cab47-0242-4639-ae86-5e48f5e5149d-console-oauth-config\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.329849 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-trusted-ca-bundle\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.329870 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-console-config\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.329902 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f75cab47-0242-4639-ae86-5e48f5e5149d-console-serving-cert\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.329917 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk9xk\" (UniqueName: \"kubernetes.io/projected/f75cab47-0242-4639-ae86-5e48f5e5149d-kube-api-access-xk9xk\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.330889 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-oauth-serving-cert\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.331835 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-trusted-ca-bundle\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.332414 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-service-ca\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.332877 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/f75cab47-0242-4639-ae86-5e48f5e5149d-console-config\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.335438 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f75cab47-0242-4639-ae86-5e48f5e5149d-console-oauth-config\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.336671 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f75cab47-0242-4639-ae86-5e48f5e5149d-console-serving-cert\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.344874 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk9xk\" (UniqueName: \"kubernetes.io/projected/f75cab47-0242-4639-ae86-5e48f5e5149d-kube-api-access-xk9xk\") pod \"console-969d5cc7f-gr4h2\" (UID: \"f75cab47-0242-4639-ae86-5e48f5e5149d\") " pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.427860 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.431344 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fccbfa74-64a7-4920-a145-abde992f617d-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-xpnht\" (UID: \"fccbfa74-64a7-4920-a145-abde992f617d\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.435861 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fccbfa74-64a7-4920-a145-abde992f617d-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-xpnht\" (UID: \"fccbfa74-64a7-4920-a145-abde992f617d\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.532923 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27901900-87d9-45a6-a5cb-1fcf505917ee-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.537608 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27901900-87d9-45a6-a5cb-1fcf505917ee-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-82lk9\" (UID: \"27901900-87d9-45a6-a5cb-1fcf505917ee\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.598081 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-969d5cc7f-gr4h2"] Dec 12 00:39:17 crc kubenswrapper[4606]: W1212 00:39:17.606473 4606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf75cab47_0242_4639_ae86_5e48f5e5149d.slice/crio-6fa0b9d4fa2375299f5e106e7bffb2dcde6476c812b308ae58dce1bbd91a906e WatchSource:0}: Error finding container 6fa0b9d4fa2375299f5e106e7bffb2dcde6476c812b308ae58dce1bbd91a906e: Status 404 returned error can't find the container with id 6fa0b9d4fa2375299f5e106e7bffb2dcde6476c812b308ae58dce1bbd91a906e Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.620601 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.631151 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m5nlb" event={"ID":"c5c179d2-3e8f-4aa4-8b37-737c167dd42f","Type":"ContainerStarted","Data":"97fd31fd993ffc94689796195c1d56bade705399cc78334e6966ee337c157e60"} Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.632395 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" event={"ID":"042d210a-3148-4706-8e99-798c7cab2239","Type":"ContainerStarted","Data":"a38338de4b9261ebbfd25aae74e3a29b71ee5d075714f2811a7666885686765f"} Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.633493 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-969d5cc7f-gr4h2" event={"ID":"f75cab47-0242-4639-ae86-5e48f5e5149d","Type":"ContainerStarted","Data":"6fa0b9d4fa2375299f5e106e7bffb2dcde6476c812b308ae58dce1bbd91a906e"} Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.791768 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.834079 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-xpnht"] Dec 12 00:39:17 crc kubenswrapper[4606]: W1212 00:39:17.850902 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfccbfa74_64a7_4920_a145_abde992f617d.slice/crio-5b7c8da69722b87e999442ac1cb6da7538e6f4401f45325f3cf0d8d737cf0a74 WatchSource:0}: Error finding container 5b7c8da69722b87e999442ac1cb6da7538e6f4401f45325f3cf0d8d737cf0a74: Status 404 returned error can't find the container with id 5b7c8da69722b87e999442ac1cb6da7538e6f4401f45325f3cf0d8d737cf0a74 Dec 12 00:39:17 crc kubenswrapper[4606]: I1212 00:39:17.981732 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9"] Dec 12 00:39:18 crc kubenswrapper[4606]: W1212 00:39:18.000553 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27901900_87d9_45a6_a5cb_1fcf505917ee.slice/crio-74b178f49d119d460bb33c815b20fb4e2c01708aaa01a4249145e994ff749537 WatchSource:0}: Error finding container 74b178f49d119d460bb33c815b20fb4e2c01708aaa01a4249145e994ff749537: Status 404 returned error can't find the container with id 74b178f49d119d460bb33c815b20fb4e2c01708aaa01a4249145e994ff749537 Dec 12 00:39:18 crc kubenswrapper[4606]: I1212 00:39:18.639870 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" event={"ID":"fccbfa74-64a7-4920-a145-abde992f617d","Type":"ContainerStarted","Data":"5b7c8da69722b87e999442ac1cb6da7538e6f4401f45325f3cf0d8d737cf0a74"} Dec 12 00:39:18 crc kubenswrapper[4606]: I1212 00:39:18.641956 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-969d5cc7f-gr4h2" event={"ID":"f75cab47-0242-4639-ae86-5e48f5e5149d","Type":"ContainerStarted","Data":"552ad96b9756d4e88e0c3e5694559f4474423dcb46239163ae289bc93dc646e1"} Dec 12 00:39:18 crc kubenswrapper[4606]: I1212 00:39:18.642821 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" event={"ID":"27901900-87d9-45a6-a5cb-1fcf505917ee","Type":"ContainerStarted","Data":"74b178f49d119d460bb33c815b20fb4e2c01708aaa01a4249145e994ff749537"} Dec 12 00:39:18 crc kubenswrapper[4606]: I1212 00:39:18.666181 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-969d5cc7f-gr4h2" podStartSLOduration=1.66615347 podStartE2EDuration="1.66615347s" podCreationTimestamp="2025-12-12 00:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:39:18.659625724 +0000 UTC m=+949.204978590" watchObservedRunningTime="2025-12-12 00:39:18.66615347 +0000 UTC m=+949.211506336" Dec 12 00:39:20 crc kubenswrapper[4606]: I1212 00:39:20.660608 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m5nlb" event={"ID":"c5c179d2-3e8f-4aa4-8b37-737c167dd42f","Type":"ContainerStarted","Data":"da6f6c0962fca5646dae9ce90054d41caed1095b551d4b2c5472ddba7db830d5"} Dec 12 00:39:20 crc kubenswrapper[4606]: I1212 00:39:20.660992 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:20 crc kubenswrapper[4606]: I1212 00:39:20.664237 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" event={"ID":"fccbfa74-64a7-4920-a145-abde992f617d","Type":"ContainerStarted","Data":"a062f37bc8eedf015a2e0d8b56d784aa14e16d0f246c769003e6a06cfac8f5a8"} Dec 12 00:39:20 crc kubenswrapper[4606]: I1212 00:39:20.664446 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:20 crc kubenswrapper[4606]: I1212 00:39:20.666346 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" event={"ID":"042d210a-3148-4706-8e99-798c7cab2239","Type":"ContainerStarted","Data":"aab187e672a9a3426a1e9a3f77ba01cfd0e03383a9a09d0416bca8a9018852b5"} Dec 12 00:39:20 crc kubenswrapper[4606]: I1212 00:39:20.686280 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-m5nlb" podStartSLOduration=1.953848129 podStartE2EDuration="4.68625528s" podCreationTimestamp="2025-12-12 00:39:16 +0000 UTC" firstStartedPulling="2025-12-12 00:39:17.087058983 +0000 UTC m=+947.632411849" lastFinishedPulling="2025-12-12 00:39:19.819466134 +0000 UTC m=+950.364819000" observedRunningTime="2025-12-12 00:39:20.678001858 +0000 UTC m=+951.223354724" watchObservedRunningTime="2025-12-12 00:39:20.68625528 +0000 UTC m=+951.231608146" Dec 12 00:39:20 crc kubenswrapper[4606]: I1212 00:39:20.708826 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" podStartSLOduration=2.764269269 podStartE2EDuration="4.708806237s" podCreationTimestamp="2025-12-12 00:39:16 +0000 UTC" firstStartedPulling="2025-12-12 00:39:17.85946615 +0000 UTC m=+948.404819016" lastFinishedPulling="2025-12-12 00:39:19.804003118 +0000 UTC m=+950.349355984" observedRunningTime="2025-12-12 00:39:20.702347403 +0000 UTC m=+951.247700289" watchObservedRunningTime="2025-12-12 00:39:20.708806237 +0000 UTC m=+951.254159103" Dec 12 00:39:21 crc kubenswrapper[4606]: I1212 00:39:21.675670 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" 
event={"ID":"27901900-87d9-45a6-a5cb-1fcf505917ee","Type":"ContainerStarted","Data":"3a570b8a6bdfc14cf0bfbadfeb34a02a8bcfe77de56374e5c3c32e66868badc9"} Dec 12 00:39:21 crc kubenswrapper[4606]: I1212 00:39:21.691777 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-82lk9" podStartSLOduration=2.763431235 podStartE2EDuration="5.691762697s" podCreationTimestamp="2025-12-12 00:39:16 +0000 UTC" firstStartedPulling="2025-12-12 00:39:18.003308509 +0000 UTC m=+948.548661365" lastFinishedPulling="2025-12-12 00:39:20.931639961 +0000 UTC m=+951.476992827" observedRunningTime="2025-12-12 00:39:21.689234879 +0000 UTC m=+952.234587745" watchObservedRunningTime="2025-12-12 00:39:21.691762697 +0000 UTC m=+952.237115563" Dec 12 00:39:22 crc kubenswrapper[4606]: I1212 00:39:22.682415 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" event={"ID":"042d210a-3148-4706-8e99-798c7cab2239","Type":"ContainerStarted","Data":"b7d74fa18b1f0d431d4069658fa5a367dd4c74f10e2eb6d2df767cd13a00a1ef"} Dec 12 00:39:22 crc kubenswrapper[4606]: I1212 00:39:22.702041 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-95nwl" podStartSLOduration=1.595851669 podStartE2EDuration="6.702019923s" podCreationTimestamp="2025-12-12 00:39:16 +0000 UTC" firstStartedPulling="2025-12-12 00:39:17.294989196 +0000 UTC m=+947.840342062" lastFinishedPulling="2025-12-12 00:39:22.40115745 +0000 UTC m=+952.946510316" observedRunningTime="2025-12-12 00:39:22.697640065 +0000 UTC m=+953.242992931" watchObservedRunningTime="2025-12-12 00:39:22.702019923 +0000 UTC m=+953.247372799" Dec 12 00:39:27 crc kubenswrapper[4606]: I1212 00:39:27.050788 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-m5nlb" Dec 12 00:39:27 crc kubenswrapper[4606]: I1212 00:39:27.459602 
4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:27 crc kubenswrapper[4606]: I1212 00:39:27.460514 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:27 crc kubenswrapper[4606]: I1212 00:39:27.466715 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:27 crc kubenswrapper[4606]: I1212 00:39:27.720918 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-969d5cc7f-gr4h2" Dec 12 00:39:27 crc kubenswrapper[4606]: I1212 00:39:27.779356 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dlrwh"] Dec 12 00:39:37 crc kubenswrapper[4606]: I1212 00:39:37.633041 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-xpnht" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.227159 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf"] Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.228930 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.231537 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.246847 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf"] Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.303454 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4frzk\" (UniqueName: \"kubernetes.io/projected/ca9f70a4-aa76-4acc-bcd5-90581609d523-kube-api-access-4frzk\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.303581 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.303667 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: 
I1212 00:39:51.404867 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4frzk\" (UniqueName: \"kubernetes.io/projected/ca9f70a4-aa76-4acc-bcd5-90581609d523-kube-api-access-4frzk\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.404975 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.405035 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.405771 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.405944 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.434703 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4frzk\" (UniqueName: \"kubernetes.io/projected/ca9f70a4-aa76-4acc-bcd5-90581609d523-kube-api-access-4frzk\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.548107 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.784161 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf"] Dec 12 00:39:51 crc kubenswrapper[4606]: I1212 00:39:51.866018 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" event={"ID":"ca9f70a4-aa76-4acc-bcd5-90581609d523","Type":"ContainerStarted","Data":"294a9f5570db7d15bbc411d1892689cdcc5af5b6b7b960519d98662d9c5744b5"} Dec 12 00:39:52 crc kubenswrapper[4606]: I1212 00:39:52.836514 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dlrwh" podUID="c454b7c4-18db-442a-ae25-d66e7e6061f3" containerName="console" containerID="cri-o://6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426" gracePeriod=15 Dec 12 00:39:52 crc kubenswrapper[4606]: I1212 00:39:52.874737 4606 
generic.go:334] "Generic (PLEG): container finished" podID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerID="fb210c0ee6ab8ad43db6592fbe78399ef55e680e3e97f076d5887135bfdc8f78" exitCode=0 Dec 12 00:39:52 crc kubenswrapper[4606]: I1212 00:39:52.874796 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" event={"ID":"ca9f70a4-aa76-4acc-bcd5-90581609d523","Type":"ContainerDied","Data":"fb210c0ee6ab8ad43db6592fbe78399ef55e680e3e97f076d5887135bfdc8f78"} Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.207355 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dlrwh_c454b7c4-18db-442a-ae25-d66e7e6061f3/console/0.log" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.207691 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.329519 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb6vx\" (UniqueName: \"kubernetes.io/projected/c454b7c4-18db-442a-ae25-d66e7e6061f3-kube-api-access-kb6vx\") pod \"c454b7c4-18db-442a-ae25-d66e7e6061f3\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.329593 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-oauth-config\") pod \"c454b7c4-18db-442a-ae25-d66e7e6061f3\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.329623 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-config\") pod \"c454b7c4-18db-442a-ae25-d66e7e6061f3\" 
(UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.329642 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-oauth-serving-cert\") pod \"c454b7c4-18db-442a-ae25-d66e7e6061f3\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.329665 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-serving-cert\") pod \"c454b7c4-18db-442a-ae25-d66e7e6061f3\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.329717 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-service-ca\") pod \"c454b7c4-18db-442a-ae25-d66e7e6061f3\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.329779 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-trusted-ca-bundle\") pod \"c454b7c4-18db-442a-ae25-d66e7e6061f3\" (UID: \"c454b7c4-18db-442a-ae25-d66e7e6061f3\") " Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.330500 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c454b7c4-18db-442a-ae25-d66e7e6061f3" (UID: "c454b7c4-18db-442a-ae25-d66e7e6061f3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.330580 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c454b7c4-18db-442a-ae25-d66e7e6061f3" (UID: "c454b7c4-18db-442a-ae25-d66e7e6061f3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.331104 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-config" (OuterVolumeSpecName: "console-config") pod "c454b7c4-18db-442a-ae25-d66e7e6061f3" (UID: "c454b7c4-18db-442a-ae25-d66e7e6061f3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.331269 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-service-ca" (OuterVolumeSpecName: "service-ca") pod "c454b7c4-18db-442a-ae25-d66e7e6061f3" (UID: "c454b7c4-18db-442a-ae25-d66e7e6061f3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.335526 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c454b7c4-18db-442a-ae25-d66e7e6061f3-kube-api-access-kb6vx" (OuterVolumeSpecName: "kube-api-access-kb6vx") pod "c454b7c4-18db-442a-ae25-d66e7e6061f3" (UID: "c454b7c4-18db-442a-ae25-d66e7e6061f3"). InnerVolumeSpecName "kube-api-access-kb6vx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.335526 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c454b7c4-18db-442a-ae25-d66e7e6061f3" (UID: "c454b7c4-18db-442a-ae25-d66e7e6061f3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.337719 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c454b7c4-18db-442a-ae25-d66e7e6061f3" (UID: "c454b7c4-18db-442a-ae25-d66e7e6061f3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.432575 4606 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.432939 4606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.433069 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb6vx\" (UniqueName: \"kubernetes.io/projected/c454b7c4-18db-442a-ae25-d66e7e6061f3-kube-api-access-kb6vx\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.433092 4606 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.433109 4606 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.433126 4606 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c454b7c4-18db-442a-ae25-d66e7e6061f3-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.433143 4606 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c454b7c4-18db-442a-ae25-d66e7e6061f3-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.882443 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dlrwh_c454b7c4-18db-442a-ae25-d66e7e6061f3/console/0.log" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.882499 4606 generic.go:334] "Generic (PLEG): container finished" podID="c454b7c4-18db-442a-ae25-d66e7e6061f3" containerID="6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426" exitCode=2 Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.882529 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dlrwh" event={"ID":"c454b7c4-18db-442a-ae25-d66e7e6061f3","Type":"ContainerDied","Data":"6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426"} Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.882620 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dlrwh" 
event={"ID":"c454b7c4-18db-442a-ae25-d66e7e6061f3","Type":"ContainerDied","Data":"0bdb716413dfc350432245354147f9ed6a46e60ec109954eeeac29fd89d8dcc9"} Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.882630 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dlrwh" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.882660 4606 scope.go:117] "RemoveContainer" containerID="6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.911550 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dlrwh"] Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.916357 4606 scope.go:117] "RemoveContainer" containerID="6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426" Dec 12 00:39:53 crc kubenswrapper[4606]: E1212 00:39:53.917107 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426\": container with ID starting with 6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426 not found: ID does not exist" containerID="6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.917147 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426"} err="failed to get container status \"6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426\": rpc error: code = NotFound desc = could not find container \"6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426\": container with ID starting with 6a212b878af3aafa15083dc36af083bfdfc315253bb055049c8a514b2bbe5426 not found: ID does not exist" Dec 12 00:39:53 crc kubenswrapper[4606]: I1212 00:39:53.920767 4606 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-dlrwh"] Dec 12 00:39:54 crc kubenswrapper[4606]: I1212 00:39:54.894617 4606 generic.go:334] "Generic (PLEG): container finished" podID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerID="52f558e7de383ba55beddaa0bda31e295fe15f4e3d4468da7dfee0e4ba4d11a6" exitCode=0 Dec 12 00:39:54 crc kubenswrapper[4606]: I1212 00:39:54.894679 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" event={"ID":"ca9f70a4-aa76-4acc-bcd5-90581609d523","Type":"ContainerDied","Data":"52f558e7de383ba55beddaa0bda31e295fe15f4e3d4468da7dfee0e4ba4d11a6"} Dec 12 00:39:55 crc kubenswrapper[4606]: I1212 00:39:55.710060 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c454b7c4-18db-442a-ae25-d66e7e6061f3" path="/var/lib/kubelet/pods/c454b7c4-18db-442a-ae25-d66e7e6061f3/volumes" Dec 12 00:39:55 crc kubenswrapper[4606]: I1212 00:39:55.907084 4606 generic.go:334] "Generic (PLEG): container finished" podID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerID="85f94c35ce5f6dfad0c4ff160d021e1f49c45e02614b3975d515f67edec945ad" exitCode=0 Dec 12 00:39:55 crc kubenswrapper[4606]: I1212 00:39:55.907128 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" event={"ID":"ca9f70a4-aa76-4acc-bcd5-90581609d523","Type":"ContainerDied","Data":"85f94c35ce5f6dfad0c4ff160d021e1f49c45e02614b3975d515f67edec945ad"} Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.131612 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.184704 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-bundle\") pod \"ca9f70a4-aa76-4acc-bcd5-90581609d523\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.184820 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-util\") pod \"ca9f70a4-aa76-4acc-bcd5-90581609d523\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.184849 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4frzk\" (UniqueName: \"kubernetes.io/projected/ca9f70a4-aa76-4acc-bcd5-90581609d523-kube-api-access-4frzk\") pod \"ca9f70a4-aa76-4acc-bcd5-90581609d523\" (UID: \"ca9f70a4-aa76-4acc-bcd5-90581609d523\") " Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.186400 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-bundle" (OuterVolumeSpecName: "bundle") pod "ca9f70a4-aa76-4acc-bcd5-90581609d523" (UID: "ca9f70a4-aa76-4acc-bcd5-90581609d523"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.190584 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9f70a4-aa76-4acc-bcd5-90581609d523-kube-api-access-4frzk" (OuterVolumeSpecName: "kube-api-access-4frzk") pod "ca9f70a4-aa76-4acc-bcd5-90581609d523" (UID: "ca9f70a4-aa76-4acc-bcd5-90581609d523"). InnerVolumeSpecName "kube-api-access-4frzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.207057 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-util" (OuterVolumeSpecName: "util") pod "ca9f70a4-aa76-4acc-bcd5-90581609d523" (UID: "ca9f70a4-aa76-4acc-bcd5-90581609d523"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.286655 4606 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.286724 4606 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca9f70a4-aa76-4acc-bcd5-90581609d523-util\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.286737 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4frzk\" (UniqueName: \"kubernetes.io/projected/ca9f70a4-aa76-4acc-bcd5-90581609d523-kube-api-access-4frzk\") on node \"crc\" DevicePath \"\"" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.923425 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" event={"ID":"ca9f70a4-aa76-4acc-bcd5-90581609d523","Type":"ContainerDied","Data":"294a9f5570db7d15bbc411d1892689cdcc5af5b6b7b960519d98662d9c5744b5"} Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.923476 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="294a9f5570db7d15bbc411d1892689cdcc5af5b6b7b960519d98662d9c5744b5" Dec 12 00:39:57 crc kubenswrapper[4606]: I1212 00:39:57.923497 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.301280 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt"] Dec 12 00:40:06 crc kubenswrapper[4606]: E1212 00:40:06.301883 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerName="extract" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.301894 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerName="extract" Dec 12 00:40:06 crc kubenswrapper[4606]: E1212 00:40:06.301913 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerName="util" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.301920 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerName="util" Dec 12 00:40:06 crc kubenswrapper[4606]: E1212 00:40:06.301926 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c454b7c4-18db-442a-ae25-d66e7e6061f3" containerName="console" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.301932 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c454b7c4-18db-442a-ae25-d66e7e6061f3" containerName="console" Dec 12 00:40:06 crc kubenswrapper[4606]: E1212 00:40:06.301941 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerName="pull" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.301947 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9f70a4-aa76-4acc-bcd5-90581609d523" containerName="pull" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.302043 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9f70a4-aa76-4acc-bcd5-90581609d523" 
containerName="extract" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.302060 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c454b7c4-18db-442a-ae25-d66e7e6061f3" containerName="console" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.302405 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.308024 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.309604 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.312685 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-cqw4n" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.315272 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.317265 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.327458 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt"] Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.403758 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d54ebef-2685-424c-8d1a-7d3d56a8681c-webhook-cert\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 
00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.403820 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqwqz\" (UniqueName: \"kubernetes.io/projected/2d54ebef-2685-424c-8d1a-7d3d56a8681c-kube-api-access-vqwqz\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.403872 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d54ebef-2685-424c-8d1a-7d3d56a8681c-apiservice-cert\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.505317 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d54ebef-2685-424c-8d1a-7d3d56a8681c-apiservice-cert\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.505409 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d54ebef-2685-424c-8d1a-7d3d56a8681c-webhook-cert\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.505439 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqwqz\" (UniqueName: 
\"kubernetes.io/projected/2d54ebef-2685-424c-8d1a-7d3d56a8681c-kube-api-access-vqwqz\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.514291 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d54ebef-2685-424c-8d1a-7d3d56a8681c-webhook-cert\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.521957 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d54ebef-2685-424c-8d1a-7d3d56a8681c-apiservice-cert\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.523668 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqwqz\" (UniqueName: \"kubernetes.io/projected/2d54ebef-2685-424c-8d1a-7d3d56a8681c-kube-api-access-vqwqz\") pod \"metallb-operator-controller-manager-77595d9574-hx5wt\" (UID: \"2d54ebef-2685-424c-8d1a-7d3d56a8681c\") " pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.616534 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.681810 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs"] Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.682959 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.687504 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.687698 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.687815 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9tr5k" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.695094 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs"] Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.810957 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d954396-d8c4-45d2-97b3-3606eb503029-webhook-cert\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.811013 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d954396-d8c4-45d2-97b3-3606eb503029-apiservice-cert\") pod 
\"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.811068 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6qr9\" (UniqueName: \"kubernetes.io/projected/5d954396-d8c4-45d2-97b3-3606eb503029-kube-api-access-k6qr9\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.912539 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6qr9\" (UniqueName: \"kubernetes.io/projected/5d954396-d8c4-45d2-97b3-3606eb503029-kube-api-access-k6qr9\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.912627 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d954396-d8c4-45d2-97b3-3606eb503029-webhook-cert\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.912664 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d954396-d8c4-45d2-97b3-3606eb503029-apiservice-cert\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 
00:40:06.936595 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d954396-d8c4-45d2-97b3-3606eb503029-apiservice-cert\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.936960 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6qr9\" (UniqueName: \"kubernetes.io/projected/5d954396-d8c4-45d2-97b3-3606eb503029-kube-api-access-k6qr9\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.937064 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d954396-d8c4-45d2-97b3-3606eb503029-webhook-cert\") pod \"metallb-operator-webhook-server-76c66465b9-m7hxs\" (UID: \"5d954396-d8c4-45d2-97b3-3606eb503029\") " pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.938682 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt"] Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.978286 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" event={"ID":"2d54ebef-2685-424c-8d1a-7d3d56a8681c","Type":"ContainerStarted","Data":"39f07e945d0d674f5ebf843e90509f5676ec5a53952c72d2907379321ffe3479"} Dec 12 00:40:06 crc kubenswrapper[4606]: I1212 00:40:06.999495 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:07 crc kubenswrapper[4606]: I1212 00:40:07.244910 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs"] Dec 12 00:40:07 crc kubenswrapper[4606]: W1212 00:40:07.252443 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d954396_d8c4_45d2_97b3_3606eb503029.slice/crio-73ff747a22124880998aebec38866cc1e1f450946c7cdb7789b0e7d60c14da4f WatchSource:0}: Error finding container 73ff747a22124880998aebec38866cc1e1f450946c7cdb7789b0e7d60c14da4f: Status 404 returned error can't find the container with id 73ff747a22124880998aebec38866cc1e1f450946c7cdb7789b0e7d60c14da4f Dec 12 00:40:07 crc kubenswrapper[4606]: I1212 00:40:07.984925 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" event={"ID":"5d954396-d8c4-45d2-97b3-3606eb503029","Type":"ContainerStarted","Data":"73ff747a22124880998aebec38866cc1e1f450946c7cdb7789b0e7d60c14da4f"} Dec 12 00:40:11 crc kubenswrapper[4606]: I1212 00:40:11.009263 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" event={"ID":"2d54ebef-2685-424c-8d1a-7d3d56a8681c","Type":"ContainerStarted","Data":"50a685512f03ce8d3db2d7639160437ff88d063c94ad647e768c111858e2315a"} Dec 12 00:40:11 crc kubenswrapper[4606]: I1212 00:40:11.010493 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:13 crc kubenswrapper[4606]: I1212 00:40:13.019312 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" 
event={"ID":"5d954396-d8c4-45d2-97b3-3606eb503029","Type":"ContainerStarted","Data":"29904980b47de40173bbba017f4619c6d342d1a9dd0313cc574523be1d2f70ee"} Dec 12 00:40:13 crc kubenswrapper[4606]: I1212 00:40:13.019645 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:13 crc kubenswrapper[4606]: I1212 00:40:13.037262 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" podStartSLOduration=3.350076668 podStartE2EDuration="7.037242948s" podCreationTimestamp="2025-12-12 00:40:06 +0000 UTC" firstStartedPulling="2025-12-12 00:40:06.936576136 +0000 UTC m=+997.481929002" lastFinishedPulling="2025-12-12 00:40:10.623742416 +0000 UTC m=+1001.169095282" observedRunningTime="2025-12-12 00:40:11.034758129 +0000 UTC m=+1001.580111015" watchObservedRunningTime="2025-12-12 00:40:13.037242948 +0000 UTC m=+1003.582595814" Dec 12 00:40:27 crc kubenswrapper[4606]: I1212 00:40:27.009829 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" Dec 12 00:40:27 crc kubenswrapper[4606]: I1212 00:40:27.047492 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-76c66465b9-m7hxs" podStartSLOduration=15.887658273 podStartE2EDuration="21.047476668s" podCreationTimestamp="2025-12-12 00:40:06 +0000 UTC" firstStartedPulling="2025-12-12 00:40:07.255890264 +0000 UTC m=+997.801243130" lastFinishedPulling="2025-12-12 00:40:12.415708659 +0000 UTC m=+1002.961061525" observedRunningTime="2025-12-12 00:40:13.042572051 +0000 UTC m=+1003.587924917" watchObservedRunningTime="2025-12-12 00:40:27.047476668 +0000 UTC m=+1017.592829534" Dec 12 00:40:46 crc kubenswrapper[4606]: I1212 00:40:46.618726 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/metallb-operator-controller-manager-77595d9574-hx5wt" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.470252 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-n4sjm"] Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.472873 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.474672 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7f87n" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.474716 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.474925 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.485184 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"] Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.486465 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.488081 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.497774 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"] Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563225 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcwwz\" (UniqueName: \"kubernetes.io/projected/5d1c165f-9379-412b-b7aa-6e4da7c4717a-kube-api-access-lcwwz\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563276 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-reloader\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563303 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics-certs\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563437 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-conf\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: 
I1212 00:40:47.563494 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-sockets\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563633 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxsz\" (UniqueName: \"kubernetes.io/projected/886be8e2-677e-4bd4-81cf-032dd6d8a890-kube-api-access-vcxsz\") pod \"frr-k8s-webhook-server-7784b6fcf-gx2jt\" (UID: \"886be8e2-677e-4bd4-81cf-032dd6d8a890\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563677 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/886be8e2-677e-4bd4-81cf-032dd6d8a890-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-gx2jt\" (UID: \"886be8e2-677e-4bd4-81cf-032dd6d8a890\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563727 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-startup\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.563750 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 
00:40:47.590250 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-xngsn"]
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.591082 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.594166 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.594467 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.594528 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.594647 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qzplr"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.598728 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-74lqn"]
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.599620 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.603895 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.612835 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-74lqn"]
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665268 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcwwz\" (UniqueName: \"kubernetes.io/projected/5d1c165f-9379-412b-b7aa-6e4da7c4717a-kube-api-access-lcwwz\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665330 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-reloader\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665364 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmpnl\" (UniqueName: \"kubernetes.io/projected/166cd42d-4038-46ed-aa22-d264904eb215-kube-api-access-nmpnl\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665384 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics-certs\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665406 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-conf\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665490 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-sockets\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: E1212 00:40:47.665521 4606 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665564 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: E1212 00:40:47.665592 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics-certs podName:5d1c165f-9379-412b-b7aa-6e4da7c4717a nodeName:}" failed. No retries permitted until 2025-12-12 00:40:48.165576138 +0000 UTC m=+1038.710929004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics-certs") pod "frr-k8s-n4sjm" (UID: "5d1c165f-9379-412b-b7aa-6e4da7c4717a") : secret "frr-k8s-certs-secret" not found
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665620 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxsz\" (UniqueName: \"kubernetes.io/projected/886be8e2-677e-4bd4-81cf-032dd6d8a890-kube-api-access-vcxsz\") pod \"frr-k8s-webhook-server-7784b6fcf-gx2jt\" (UID: \"886be8e2-677e-4bd4-81cf-032dd6d8a890\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665650 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/886be8e2-677e-4bd4-81cf-032dd6d8a890-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-gx2jt\" (UID: \"886be8e2-677e-4bd4-81cf-032dd6d8a890\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665680 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-startup\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665705 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/166cd42d-4038-46ed-aa22-d264904eb215-metallb-excludel2\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665727 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665766 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665863 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-reloader\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665917 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-conf\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.665944 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-sockets\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.666119 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.666683 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d1c165f-9379-412b-b7aa-6e4da7c4717a-frr-startup\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.671486 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/886be8e2-677e-4bd4-81cf-032dd6d8a890-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-gx2jt\" (UID: \"886be8e2-677e-4bd4-81cf-032dd6d8a890\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.690661 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcwwz\" (UniqueName: \"kubernetes.io/projected/5d1c165f-9379-412b-b7aa-6e4da7c4717a-kube-api-access-lcwwz\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.701905 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxsz\" (UniqueName: \"kubernetes.io/projected/886be8e2-677e-4bd4-81cf-032dd6d8a890-kube-api-access-vcxsz\") pod \"frr-k8s-webhook-server-7784b6fcf-gx2jt\" (UID: \"886be8e2-677e-4bd4-81cf-032dd6d8a890\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767049 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/166cd42d-4038-46ed-aa22-d264904eb215-metallb-excludel2\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767103 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767144 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-cert\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767181 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkfqf\" (UniqueName: \"kubernetes.io/projected/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-kube-api-access-rkfqf\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767215 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmpnl\" (UniqueName: \"kubernetes.io/projected/166cd42d-4038-46ed-aa22-d264904eb215-kube-api-access-nmpnl\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: E1212 00:40:47.767315 4606 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 12 00:40:47 crc kubenswrapper[4606]: E1212 00:40:47.767362 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist podName:166cd42d-4038-46ed-aa22-d264904eb215 nodeName:}" failed. No retries permitted until 2025-12-12 00:40:48.267348572 +0000 UTC m=+1038.812701438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist") pod "speaker-xngsn" (UID: "166cd42d-4038-46ed-aa22-d264904eb215") : secret "metallb-memberlist" not found
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767425 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-metrics-certs\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767535 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: E1212 00:40:47.767669 4606 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Dec 12 00:40:47 crc kubenswrapper[4606]: E1212 00:40:47.767730 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs podName:166cd42d-4038-46ed-aa22-d264904eb215 nodeName:}" failed. No retries permitted until 2025-12-12 00:40:48.267709322 +0000 UTC m=+1038.813062188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs") pod "speaker-xngsn" (UID: "166cd42d-4038-46ed-aa22-d264904eb215") : secret "speaker-certs-secret" not found
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.767967 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/166cd42d-4038-46ed-aa22-d264904eb215-metallb-excludel2\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.784964 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmpnl\" (UniqueName: \"kubernetes.io/projected/166cd42d-4038-46ed-aa22-d264904eb215-kube-api-access-nmpnl\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.811669 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.869356 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-cert\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.869418 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkfqf\" (UniqueName: \"kubernetes.io/projected/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-kube-api-access-rkfqf\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.869486 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-metrics-certs\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.871894 4606 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.874759 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-metrics-certs\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.884539 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-cert\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.894848 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkfqf\" (UniqueName: \"kubernetes.io/projected/c2e79dcf-8eee-4042-b9b0-8edcf88f3fce-kube-api-access-rkfqf\") pod \"controller-5bddd4b946-74lqn\" (UID: \"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce\") " pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:47 crc kubenswrapper[4606]: I1212 00:40:47.923945 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:48 crc kubenswrapper[4606]: I1212 00:40:48.172979 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics-certs\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:48 crc kubenswrapper[4606]: I1212 00:40:48.176380 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d1c165f-9379-412b-b7aa-6e4da7c4717a-metrics-certs\") pod \"frr-k8s-n4sjm\" (UID: \"5d1c165f-9379-412b-b7aa-6e4da7c4717a\") " pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:48 crc kubenswrapper[4606]: I1212 00:40:48.274520 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:48 crc kubenswrapper[4606]: I1212 00:40:48.274643 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:48 crc kubenswrapper[4606]: E1212 00:40:48.274790 4606 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Dec 12 00:40:48 crc kubenswrapper[4606]: E1212 00:40:48.274910 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs podName:166cd42d-4038-46ed-aa22-d264904eb215 nodeName:}" failed. No retries permitted until 2025-12-12 00:40:49.2748833 +0000 UTC m=+1039.820236206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs") pod "speaker-xngsn" (UID: "166cd42d-4038-46ed-aa22-d264904eb215") : secret "speaker-certs-secret" not found
Dec 12 00:40:48 crc kubenswrapper[4606]: E1212 00:40:48.274978 4606 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 12 00:40:48 crc kubenswrapper[4606]: E1212 00:40:48.275112 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist podName:166cd42d-4038-46ed-aa22-d264904eb215 nodeName:}" failed. No retries permitted until 2025-12-12 00:40:49.275073735 +0000 UTC m=+1039.820426641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist") pod "speaker-xngsn" (UID: "166cd42d-4038-46ed-aa22-d264904eb215") : secret "metallb-memberlist" not found
Dec 12 00:40:48 crc kubenswrapper[4606]: I1212 00:40:48.399280 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:40:49 crc kubenswrapper[4606]: I1212 00:40:49.177345 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-74lqn"]
Dec 12 00:40:49 crc kubenswrapper[4606]: I1212 00:40:49.273436 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerStarted","Data":"fd6589dd41a3da4106ea4b8e8eeb2d130ef9164d8552a06b53fdbb2f47c7dc41"}
Dec 12 00:40:49 crc kubenswrapper[4606]: I1212 00:40:49.274216 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-74lqn" event={"ID":"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce","Type":"ContainerStarted","Data":"48eb2fe87ccdd2bc2af34109baf09c6a52d6e6098f3c15cab9f3eb66916dbb6f"}
Dec 12 00:40:49 crc kubenswrapper[4606]: I1212 00:40:49.289893 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:49 crc kubenswrapper[4606]: I1212 00:40:49.289948 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:49 crc kubenswrapper[4606]: E1212 00:40:49.290080 4606 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 12 00:40:49 crc kubenswrapper[4606]: E1212 00:40:49.290133 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist podName:166cd42d-4038-46ed-aa22-d264904eb215 nodeName:}" failed. No retries permitted until 2025-12-12 00:40:51.290118919 +0000 UTC m=+1041.835471785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist") pod "speaker-xngsn" (UID: "166cd42d-4038-46ed-aa22-d264904eb215") : secret "metallb-memberlist" not found
Dec 12 00:40:49 crc kubenswrapper[4606]: I1212 00:40:49.296749 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-metrics-certs\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:49 crc kubenswrapper[4606]: I1212 00:40:49.323875 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"]
Dec 12 00:40:49 crc kubenswrapper[4606]: W1212 00:40:49.337510 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886be8e2_677e_4bd4_81cf_032dd6d8a890.slice/crio-b7bd839c4abd429483416b204259a97493f7d0d4569eed83afbb1631c49d0bbd WatchSource:0}: Error finding container b7bd839c4abd429483416b204259a97493f7d0d4569eed83afbb1631c49d0bbd: Status 404 returned error can't find the container with id b7bd839c4abd429483416b204259a97493f7d0d4569eed83afbb1631c49d0bbd
Dec 12 00:40:50 crc kubenswrapper[4606]: I1212 00:40:50.291209 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-74lqn" event={"ID":"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce","Type":"ContainerStarted","Data":"73c699946bebd6e2ef2d01544fa098d8b3e39fef23d22e71ac66e43871a8a4b9"}
Dec 12 00:40:50 crc kubenswrapper[4606]: I1212 00:40:50.291497 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-74lqn" event={"ID":"c2e79dcf-8eee-4042-b9b0-8edcf88f3fce","Type":"ContainerStarted","Data":"d692d27a822d12f03d111299889800ed512a495b165b3c21a84b220c46e307c6"}
Dec 12 00:40:50 crc kubenswrapper[4606]: I1212 00:40:50.291513 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:40:50 crc kubenswrapper[4606]: I1212 00:40:50.293193 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt" event={"ID":"886be8e2-677e-4bd4-81cf-032dd6d8a890","Type":"ContainerStarted","Data":"b7bd839c4abd429483416b204259a97493f7d0d4569eed83afbb1631c49d0bbd"}
Dec 12 00:40:50 crc kubenswrapper[4606]: I1212 00:40:50.317968 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-74lqn" podStartSLOduration=3.317951916 podStartE2EDuration="3.317951916s" podCreationTimestamp="2025-12-12 00:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:40:50.314231806 +0000 UTC m=+1040.859584672" watchObservedRunningTime="2025-12-12 00:40:50.317951916 +0000 UTC m=+1040.863304782"
Dec 12 00:40:51 crc kubenswrapper[4606]: I1212 00:40:51.318186 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:51 crc kubenswrapper[4606]: I1212 00:40:51.338966 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/166cd42d-4038-46ed-aa22-d264904eb215-memberlist\") pod \"speaker-xngsn\" (UID: \"166cd42d-4038-46ed-aa22-d264904eb215\") " pod="metallb-system/speaker-xngsn"
Dec 12 00:40:51 crc kubenswrapper[4606]: I1212 00:40:51.505264 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-xngsn"
Dec 12 00:40:51 crc kubenswrapper[4606]: W1212 00:40:51.527538 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166cd42d_4038_46ed_aa22_d264904eb215.slice/crio-e293517cb85c47621b2596e2a462238ce82ace4d2e259f48ac642cbad6d14d60 WatchSource:0}: Error finding container e293517cb85c47621b2596e2a462238ce82ace4d2e259f48ac642cbad6d14d60: Status 404 returned error can't find the container with id e293517cb85c47621b2596e2a462238ce82ace4d2e259f48ac642cbad6d14d60
Dec 12 00:40:52 crc kubenswrapper[4606]: I1212 00:40:52.312201 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xngsn" event={"ID":"166cd42d-4038-46ed-aa22-d264904eb215","Type":"ContainerStarted","Data":"9bf38a9134ee1c2e9a74534cdd33006ee8515fd3e0a87819588e72b3166cb937"}
Dec 12 00:40:52 crc kubenswrapper[4606]: I1212 00:40:52.312532 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xngsn" event={"ID":"166cd42d-4038-46ed-aa22-d264904eb215","Type":"ContainerStarted","Data":"ab697e905c8f45762e72726a9f63f3fd697753fef4c7c4d3aebe7baf1a9c8fba"}
Dec 12 00:40:52 crc kubenswrapper[4606]: I1212 00:40:52.312543 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xngsn" event={"ID":"166cd42d-4038-46ed-aa22-d264904eb215","Type":"ContainerStarted","Data":"e293517cb85c47621b2596e2a462238ce82ace4d2e259f48ac642cbad6d14d60"}
Dec 12 00:40:52 crc kubenswrapper[4606]: I1212 00:40:52.312936 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-xngsn"
Dec 12 00:40:52 crc kubenswrapper[4606]: I1212 00:40:52.347136 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-xngsn" podStartSLOduration=5.3471164380000005 podStartE2EDuration="5.347116438s" podCreationTimestamp="2025-12-12 00:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:40:52.343110011 +0000 UTC m=+1042.888462877" watchObservedRunningTime="2025-12-12 00:40:52.347116438 +0000 UTC m=+1042.892469304"
Dec 12 00:40:57 crc kubenswrapper[4606]: I1212 00:40:57.349208 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt" event={"ID":"886be8e2-677e-4bd4-81cf-032dd6d8a890","Type":"ContainerStarted","Data":"e2fde182d26fb89487b71ffb0a42b7a1a6ad4290ce5c7ccff4dd71c77c34a29c"}
Dec 12 00:40:57 crc kubenswrapper[4606]: I1212 00:40:57.349673 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"
Dec 12 00:40:57 crc kubenswrapper[4606]: I1212 00:40:57.351432 4606 generic.go:334] "Generic (PLEG): container finished" podID="5d1c165f-9379-412b-b7aa-6e4da7c4717a" containerID="c48bb3791be8628c908f9f4265fd031b4b7d5f8aaccafa6f8b878649e6043fb9" exitCode=0
Dec 12 00:40:57 crc kubenswrapper[4606]: I1212 00:40:57.351529 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerDied","Data":"c48bb3791be8628c908f9f4265fd031b4b7d5f8aaccafa6f8b878649e6043fb9"}
Dec 12 00:40:57 crc kubenswrapper[4606]: I1212 00:40:57.376589 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt" podStartSLOduration=3.097667829 podStartE2EDuration="10.376560943s" podCreationTimestamp="2025-12-12 00:40:47 +0000 UTC" firstStartedPulling="2025-12-12 00:40:49.339513782 +0000 UTC m=+1039.884866648" lastFinishedPulling="2025-12-12 00:40:56.618406896 +0000 UTC m=+1047.163759762" observedRunningTime="2025-12-12 00:40:57.375062953 +0000 UTC m=+1047.920415809" watchObservedRunningTime="2025-12-12 00:40:57.376560943 +0000 UTC m=+1047.921913819"
Dec 12 00:40:58 crc kubenswrapper[4606]: I1212 00:40:58.370844 4606 generic.go:334] "Generic (PLEG): container finished" podID="5d1c165f-9379-412b-b7aa-6e4da7c4717a" containerID="70578c4232b555c598f85d2a01e8281a5a12eced9e82bdcd2c42868d9d714b6d" exitCode=0
Dec 12 00:40:58 crc kubenswrapper[4606]: I1212 00:40:58.370979 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerDied","Data":"70578c4232b555c598f85d2a01e8281a5a12eced9e82bdcd2c42868d9d714b6d"}
Dec 12 00:40:59 crc kubenswrapper[4606]: I1212 00:40:59.378952 4606 generic.go:334] "Generic (PLEG): container finished" podID="5d1c165f-9379-412b-b7aa-6e4da7c4717a" containerID="0d9f2ef7f6cc6fd43c75badee5bbee29935d4cae03a4f49fc52bb745348e3673" exitCode=0
Dec 12 00:40:59 crc kubenswrapper[4606]: I1212 00:40:59.379072 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerDied","Data":"0d9f2ef7f6cc6fd43c75badee5bbee29935d4cae03a4f49fc52bb745348e3673"}
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.388720 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerStarted","Data":"7b819cfb8c1e2d2c49f551a2c595cd181ad95127c68f29b8ac1b7edcbc03b0c6"}
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.389027 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerStarted","Data":"642f3befc239b2182c16d45c32cb1a384666f531ea90881a4dcc6bb0b4683e08"}
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.389039 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerStarted","Data":"04771b151d9531f157464cae55fbb7ee468e1cb8965182f3200c36f5f34fe3be"}
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.389047 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerStarted","Data":"88bbbb2d5d480fbc45c746c25b29eccd6e37bc181ebc1d987833e872f64ab1de"}
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.389056 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerStarted","Data":"585ff663ea6e87d03467f9799f61a32d9384cc64df0dbf01383e988d2a14fe62"}
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.389064 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4sjm" event={"ID":"5d1c165f-9379-412b-b7aa-6e4da7c4717a","Type":"ContainerStarted","Data":"b1d336af4bee47eec55b206585d6feef232d6b6e36b622ebcce82da69c7e83aa"}
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.389315 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:41:00 crc kubenswrapper[4606]: I1212 00:41:00.424572 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-n4sjm" podStartSLOduration=5.843374505 podStartE2EDuration="13.424551742s" podCreationTimestamp="2025-12-12 00:40:47 +0000 UTC" firstStartedPulling="2025-12-12 00:40:49.016234347 +0000 UTC m=+1039.561587213" lastFinishedPulling="2025-12-12 00:40:56.597411584 +0000 UTC m=+1047.142764450" observedRunningTime="2025-12-12 00:41:00.418245593 +0000 UTC m=+1050.963598469" watchObservedRunningTime="2025-12-12 00:41:00.424551742 +0000 UTC m=+1050.969904618"
Dec 12 00:41:01 crc kubenswrapper[4606]: I1212 00:41:01.510316 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-xngsn"
Dec 12 00:41:02 crc kubenswrapper[4606]: I1212 00:41:02.010978 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 12 00:41:02 crc kubenswrapper[4606]: I1212 00:41:02.011055 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 12 00:41:03 crc kubenswrapper[4606]: I1212 00:41:03.399829 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:41:03 crc kubenswrapper[4606]: I1212 00:41:03.444592 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-n4sjm"
Dec 12 00:41:04 crc kubenswrapper[4606]: I1212 00:41:04.903418 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-b5s49"]
Dec 12 00:41:04 crc kubenswrapper[4606]: I1212 00:41:04.904930 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b5s49"
Dec 12 00:41:04 crc kubenswrapper[4606]: I1212 00:41:04.908797 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 12 00:41:04 crc kubenswrapper[4606]: I1212 00:41:04.909131 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-m5pfx"
Dec 12 00:41:04 crc kubenswrapper[4606]: I1212 00:41:04.912344 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 12 00:41:04 crc kubenswrapper[4606]: I1212 00:41:04.939536 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b5s49"]
Dec 12 00:41:05 crc kubenswrapper[4606]: I1212 00:41:05.065792 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgbmf\" (UniqueName: \"kubernetes.io/projected/995d96ae-3df4-4d98-9c0f-245ab5d7494a-kube-api-access-qgbmf\") pod \"openstack-operator-index-b5s49\" (UID: \"995d96ae-3df4-4d98-9c0f-245ab5d7494a\") " pod="openstack-operators/openstack-operator-index-b5s49"
Dec 12 00:41:05 crc kubenswrapper[4606]: I1212 00:41:05.167148 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgbmf\" (UniqueName: \"kubernetes.io/projected/995d96ae-3df4-4d98-9c0f-245ab5d7494a-kube-api-access-qgbmf\") pod \"openstack-operator-index-b5s49\" (UID: \"995d96ae-3df4-4d98-9c0f-245ab5d7494a\") " pod="openstack-operators/openstack-operator-index-b5s49"
Dec 12 00:41:05 crc kubenswrapper[4606]: I1212 00:41:05.186799 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgbmf\" (UniqueName: \"kubernetes.io/projected/995d96ae-3df4-4d98-9c0f-245ab5d7494a-kube-api-access-qgbmf\") pod \"openstack-operator-index-b5s49\" (UID: \"995d96ae-3df4-4d98-9c0f-245ab5d7494a\") " pod="openstack-operators/openstack-operator-index-b5s49"
Dec 12 00:41:05 crc kubenswrapper[4606]: I1212 00:41:05.242118 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b5s49"
Dec 12 00:41:05 crc kubenswrapper[4606]: W1212 00:41:05.637293 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod995d96ae_3df4_4d98_9c0f_245ab5d7494a.slice/crio-f23e53459afad34dcaa6f6266217b343e70438ccdab6705db650cd362ccaa4b0 WatchSource:0}: Error finding container f23e53459afad34dcaa6f6266217b343e70438ccdab6705db650cd362ccaa4b0: Status 404 returned error can't find the container with id f23e53459afad34dcaa6f6266217b343e70438ccdab6705db650cd362ccaa4b0
Dec 12 00:41:05 crc kubenswrapper[4606]: I1212 00:41:05.638727 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b5s49"]
Dec 12 00:41:06 crc kubenswrapper[4606]: I1212 00:41:06.426248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b5s49" event={"ID":"995d96ae-3df4-4d98-9c0f-245ab5d7494a","Type":"ContainerStarted","Data":"f23e53459afad34dcaa6f6266217b343e70438ccdab6705db650cd362ccaa4b0"}
Dec 12 00:41:07 crc kubenswrapper[4606]: I1212 00:41:07.817539 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-gx2jt"
Dec 12 00:41:07 crc kubenswrapper[4606]: I1212 00:41:07.927898 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-74lqn"
Dec 12 00:41:08 crc kubenswrapper[4606]: I1212 00:41:08.265841 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-b5s49"]
Dec 12 00:41:08 crc kubenswrapper[4606]: I1212 00:41:08.441472 4606 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b5s49" event={"ID":"995d96ae-3df4-4d98-9c0f-245ab5d7494a","Type":"ContainerStarted","Data":"7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f"} Dec 12 00:41:08 crc kubenswrapper[4606]: I1212 00:41:08.463696 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-b5s49" podStartSLOduration=2.048200642 podStartE2EDuration="4.463679077s" podCreationTimestamp="2025-12-12 00:41:04 +0000 UTC" firstStartedPulling="2025-12-12 00:41:05.639744418 +0000 UTC m=+1056.185097284" lastFinishedPulling="2025-12-12 00:41:08.055222853 +0000 UTC m=+1058.600575719" observedRunningTime="2025-12-12 00:41:08.459781033 +0000 UTC m=+1059.005133929" watchObservedRunningTime="2025-12-12 00:41:08.463679077 +0000 UTC m=+1059.009031943" Dec 12 00:41:08 crc kubenswrapper[4606]: I1212 00:41:08.885019 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p546n"] Dec 12 00:41:08 crc kubenswrapper[4606]: I1212 00:41:08.886442 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:08 crc kubenswrapper[4606]: I1212 00:41:08.896590 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p546n"] Dec 12 00:41:08 crc kubenswrapper[4606]: I1212 00:41:08.919669 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm8hz\" (UniqueName: \"kubernetes.io/projected/89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd-kube-api-access-sm8hz\") pod \"openstack-operator-index-p546n\" (UID: \"89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd\") " pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.021406 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm8hz\" (UniqueName: \"kubernetes.io/projected/89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd-kube-api-access-sm8hz\") pod \"openstack-operator-index-p546n\" (UID: \"89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd\") " pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.047820 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm8hz\" (UniqueName: \"kubernetes.io/projected/89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd-kube-api-access-sm8hz\") pod \"openstack-operator-index-p546n\" (UID: \"89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd\") " pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.218035 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.459939 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-b5s49" podUID="995d96ae-3df4-4d98-9c0f-245ab5d7494a" containerName="registry-server" containerID="cri-o://7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f" gracePeriod=2 Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.491005 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p546n"] Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.802911 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b5s49" Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.932892 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgbmf\" (UniqueName: \"kubernetes.io/projected/995d96ae-3df4-4d98-9c0f-245ab5d7494a-kube-api-access-qgbmf\") pod \"995d96ae-3df4-4d98-9c0f-245ab5d7494a\" (UID: \"995d96ae-3df4-4d98-9c0f-245ab5d7494a\") " Dec 12 00:41:09 crc kubenswrapper[4606]: I1212 00:41:09.939209 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995d96ae-3df4-4d98-9c0f-245ab5d7494a-kube-api-access-qgbmf" (OuterVolumeSpecName: "kube-api-access-qgbmf") pod "995d96ae-3df4-4d98-9c0f-245ab5d7494a" (UID: "995d96ae-3df4-4d98-9c0f-245ab5d7494a"). InnerVolumeSpecName "kube-api-access-qgbmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.034718 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgbmf\" (UniqueName: \"kubernetes.io/projected/995d96ae-3df4-4d98-9c0f-245ab5d7494a-kube-api-access-qgbmf\") on node \"crc\" DevicePath \"\"" Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.470185 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p546n" event={"ID":"89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd","Type":"ContainerStarted","Data":"f0e49c2b8da969cfa5cb98bcbc54bae825d334a4913dea2b5984c146e746f4cc"} Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.470283 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p546n" event={"ID":"89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd","Type":"ContainerStarted","Data":"a5f3d5aee0542039f22b48959f754eff275ced883545f265e9ff576b5a686598"} Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.471959 4606 generic.go:334] "Generic (PLEG): container finished" podID="995d96ae-3df4-4d98-9c0f-245ab5d7494a" containerID="7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f" exitCode=0 Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.472012 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-b5s49" Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.471998 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b5s49" event={"ID":"995d96ae-3df4-4d98-9c0f-245ab5d7494a","Type":"ContainerDied","Data":"7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f"} Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.472069 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b5s49" event={"ID":"995d96ae-3df4-4d98-9c0f-245ab5d7494a","Type":"ContainerDied","Data":"f23e53459afad34dcaa6f6266217b343e70438ccdab6705db650cd362ccaa4b0"} Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.472128 4606 scope.go:117] "RemoveContainer" containerID="7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f" Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.493500 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p546n" podStartSLOduration=2.417049272 podStartE2EDuration="2.493478188s" podCreationTimestamp="2025-12-12 00:41:08 +0000 UTC" firstStartedPulling="2025-12-12 00:41:09.498283795 +0000 UTC m=+1060.043636681" lastFinishedPulling="2025-12-12 00:41:09.574712731 +0000 UTC m=+1060.120065597" observedRunningTime="2025-12-12 00:41:10.487282392 +0000 UTC m=+1061.032635298" watchObservedRunningTime="2025-12-12 00:41:10.493478188 +0000 UTC m=+1061.038831064" Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.497052 4606 scope.go:117] "RemoveContainer" containerID="7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f" Dec 12 00:41:10 crc kubenswrapper[4606]: E1212 00:41:10.497647 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f\": container with 
ID starting with 7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f not found: ID does not exist" containerID="7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f" Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.497687 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f"} err="failed to get container status \"7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f\": rpc error: code = NotFound desc = could not find container \"7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f\": container with ID starting with 7fb7a444bac4ddbe9d78b63e2616407dd9ba3a7ebaf3df42e41de62f28afab8f not found: ID does not exist" Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.525776 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-b5s49"] Dec 12 00:41:10 crc kubenswrapper[4606]: I1212 00:41:10.531132 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-b5s49"] Dec 12 00:41:11 crc kubenswrapper[4606]: I1212 00:41:11.713868 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995d96ae-3df4-4d98-9c0f-245ab5d7494a" path="/var/lib/kubelet/pods/995d96ae-3df4-4d98-9c0f-245ab5d7494a/volumes" Dec 12 00:41:18 crc kubenswrapper[4606]: I1212 00:41:18.403072 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-n4sjm" Dec 12 00:41:19 crc kubenswrapper[4606]: I1212 00:41:19.219504 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:19 crc kubenswrapper[4606]: I1212 00:41:19.219547 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:19 crc kubenswrapper[4606]: I1212 
00:41:19.254699 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:19 crc kubenswrapper[4606]: I1212 00:41:19.581115 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p546n" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.930426 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5"] Dec 12 00:41:20 crc kubenswrapper[4606]: E1212 00:41:20.930690 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995d96ae-3df4-4d98-9c0f-245ab5d7494a" containerName="registry-server" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.930705 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="995d96ae-3df4-4d98-9c0f-245ab5d7494a" containerName="registry-server" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.930868 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="995d96ae-3df4-4d98-9c0f-245ab5d7494a" containerName="registry-server" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.931931 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.934711 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gcspp" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.948396 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5"] Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.990582 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t2tc\" (UniqueName: \"kubernetes.io/projected/50a3ac6d-a23e-479c-9356-1b42add509da-kube-api-access-9t2tc\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.990657 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-util\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:20 crc kubenswrapper[4606]: I1212 00:41:20.990714 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-bundle\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 
00:41:21.091093 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t2tc\" (UniqueName: \"kubernetes.io/projected/50a3ac6d-a23e-479c-9356-1b42add509da-kube-api-access-9t2tc\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 00:41:21.091165 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-util\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 00:41:21.091224 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-bundle\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 00:41:21.091782 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-bundle\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 00:41:21.091955 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-util\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 00:41:21.121649 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t2tc\" (UniqueName: \"kubernetes.io/projected/50a3ac6d-a23e-479c-9356-1b42add509da-kube-api-access-9t2tc\") pod \"dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 00:41:21.259259 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:21 crc kubenswrapper[4606]: I1212 00:41:21.689961 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5"] Dec 12 00:41:22 crc kubenswrapper[4606]: I1212 00:41:22.570849 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" event={"ID":"50a3ac6d-a23e-479c-9356-1b42add509da","Type":"ContainerStarted","Data":"90047f9b97a50beed4664a1eb4594c7ec6b3740c73fbdea98209a1566b8f5d7f"} Dec 12 00:41:23 crc kubenswrapper[4606]: I1212 00:41:23.579461 4606 generic.go:334] "Generic (PLEG): container finished" podID="50a3ac6d-a23e-479c-9356-1b42add509da" containerID="21c2a36b97e7b8ca823c15dd64d5a873b846460608114a277ddf4ec7daffdee0" exitCode=0 Dec 12 00:41:23 crc kubenswrapper[4606]: I1212 00:41:23.579606 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" event={"ID":"50a3ac6d-a23e-479c-9356-1b42add509da","Type":"ContainerDied","Data":"21c2a36b97e7b8ca823c15dd64d5a873b846460608114a277ddf4ec7daffdee0"} Dec 12 00:41:24 crc kubenswrapper[4606]: I1212 00:41:24.598973 4606 generic.go:334] "Generic (PLEG): container finished" podID="50a3ac6d-a23e-479c-9356-1b42add509da" containerID="148a4bf1b115652510fdd706826139480cc94ee1094eaef82bd63499f7644899" exitCode=0 Dec 12 00:41:24 crc kubenswrapper[4606]: I1212 00:41:24.599048 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" event={"ID":"50a3ac6d-a23e-479c-9356-1b42add509da","Type":"ContainerDied","Data":"148a4bf1b115652510fdd706826139480cc94ee1094eaef82bd63499f7644899"} Dec 12 00:41:25 crc kubenswrapper[4606]: I1212 00:41:25.605206 4606 generic.go:334] "Generic (PLEG): container finished" podID="50a3ac6d-a23e-479c-9356-1b42add509da" containerID="2f90057ccdc3d02a878fff9f6fca5a0f56ad61b2b13eaee5091f0ff036fa2cf9" exitCode=0 Dec 12 00:41:25 crc kubenswrapper[4606]: I1212 00:41:25.605292 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" event={"ID":"50a3ac6d-a23e-479c-9356-1b42add509da","Type":"ContainerDied","Data":"2f90057ccdc3d02a878fff9f6fca5a0f56ad61b2b13eaee5091f0ff036fa2cf9"} Dec 12 00:41:26 crc kubenswrapper[4606]: I1212 00:41:26.881970 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.065587 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t2tc\" (UniqueName: \"kubernetes.io/projected/50a3ac6d-a23e-479c-9356-1b42add509da-kube-api-access-9t2tc\") pod \"50a3ac6d-a23e-479c-9356-1b42add509da\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.065711 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-bundle\") pod \"50a3ac6d-a23e-479c-9356-1b42add509da\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.065747 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-util\") pod \"50a3ac6d-a23e-479c-9356-1b42add509da\" (UID: \"50a3ac6d-a23e-479c-9356-1b42add509da\") " Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.067114 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-bundle" (OuterVolumeSpecName: "bundle") pod "50a3ac6d-a23e-479c-9356-1b42add509da" (UID: "50a3ac6d-a23e-479c-9356-1b42add509da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.074768 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a3ac6d-a23e-479c-9356-1b42add509da-kube-api-access-9t2tc" (OuterVolumeSpecName: "kube-api-access-9t2tc") pod "50a3ac6d-a23e-479c-9356-1b42add509da" (UID: "50a3ac6d-a23e-479c-9356-1b42add509da"). InnerVolumeSpecName "kube-api-access-9t2tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.086382 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-util" (OuterVolumeSpecName: "util") pod "50a3ac6d-a23e-479c-9356-1b42add509da" (UID: "50a3ac6d-a23e-479c-9356-1b42add509da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.166704 4606 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.166733 4606 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a3ac6d-a23e-479c-9356-1b42add509da-util\") on node \"crc\" DevicePath \"\"" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.166742 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t2tc\" (UniqueName: \"kubernetes.io/projected/50a3ac6d-a23e-479c-9356-1b42add509da-kube-api-access-9t2tc\") on node \"crc\" DevicePath \"\"" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.626247 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" event={"ID":"50a3ac6d-a23e-479c-9356-1b42add509da","Type":"ContainerDied","Data":"90047f9b97a50beed4664a1eb4594c7ec6b3740c73fbdea98209a1566b8f5d7f"} Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.626664 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90047f9b97a50beed4664a1eb4594c7ec6b3740c73fbdea98209a1566b8f5d7f" Dec 12 00:41:27 crc kubenswrapper[4606]: I1212 00:41:27.626333 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5" Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.010997 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.011452 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.987196 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8"] Dec 12 00:41:32 crc kubenswrapper[4606]: E1212 00:41:32.987408 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a3ac6d-a23e-479c-9356-1b42add509da" containerName="util" Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.987419 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a3ac6d-a23e-479c-9356-1b42add509da" containerName="util" Dec 12 00:41:32 crc kubenswrapper[4606]: E1212 00:41:32.987428 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a3ac6d-a23e-479c-9356-1b42add509da" containerName="extract" Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.987433 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a3ac6d-a23e-479c-9356-1b42add509da" containerName="extract" Dec 12 00:41:32 crc kubenswrapper[4606]: E1212 00:41:32.987453 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a3ac6d-a23e-479c-9356-1b42add509da" 
containerName="pull" Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.987459 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a3ac6d-a23e-479c-9356-1b42add509da" containerName="pull" Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.987568 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a3ac6d-a23e-479c-9356-1b42add509da" containerName="extract" Dec 12 00:41:32 crc kubenswrapper[4606]: I1212 00:41:32.987965 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" Dec 12 00:41:33 crc kubenswrapper[4606]: I1212 00:41:33.002062 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-kdq9s" Dec 12 00:41:33 crc kubenswrapper[4606]: I1212 00:41:33.025574 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8"] Dec 12 00:41:33 crc kubenswrapper[4606]: I1212 00:41:33.141484 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ltt\" (UniqueName: \"kubernetes.io/projected/6f1636a2-66b9-4641-9779-34142a76a14f-kube-api-access-z6ltt\") pod \"openstack-operator-controller-operator-686f4c6566-qf7w8\" (UID: \"6f1636a2-66b9-4641-9779-34142a76a14f\") " pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" Dec 12 00:41:33 crc kubenswrapper[4606]: I1212 00:41:33.243015 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ltt\" (UniqueName: \"kubernetes.io/projected/6f1636a2-66b9-4641-9779-34142a76a14f-kube-api-access-z6ltt\") pod \"openstack-operator-controller-operator-686f4c6566-qf7w8\" (UID: \"6f1636a2-66b9-4641-9779-34142a76a14f\") " pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" Dec 12 00:41:33 
crc kubenswrapper[4606]: I1212 00:41:33.262846 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ltt\" (UniqueName: \"kubernetes.io/projected/6f1636a2-66b9-4641-9779-34142a76a14f-kube-api-access-z6ltt\") pod \"openstack-operator-controller-operator-686f4c6566-qf7w8\" (UID: \"6f1636a2-66b9-4641-9779-34142a76a14f\") " pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" Dec 12 00:41:33 crc kubenswrapper[4606]: I1212 00:41:33.307728 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" Dec 12 00:41:33 crc kubenswrapper[4606]: I1212 00:41:33.566026 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8"] Dec 12 00:41:33 crc kubenswrapper[4606]: I1212 00:41:33.664870 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" event={"ID":"6f1636a2-66b9-4641-9779-34142a76a14f","Type":"ContainerStarted","Data":"62624c6c6f71b5df601da8fbb4adf4dff8c52185afb29cc44532e171672a438d"} Dec 12 00:41:37 crc kubenswrapper[4606]: I1212 00:41:37.691113 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" event={"ID":"6f1636a2-66b9-4641-9779-34142a76a14f","Type":"ContainerStarted","Data":"05060ecc4abc31e5798079779bb0b38587cef2de184c964701378798e5e3e449"} Dec 12 00:41:37 crc kubenswrapper[4606]: I1212 00:41:37.691591 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" Dec 12 00:41:37 crc kubenswrapper[4606]: I1212 00:41:37.721863 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" podStartSLOduration=1.863332387 podStartE2EDuration="5.721839164s" podCreationTimestamp="2025-12-12 00:41:32 +0000 UTC" firstStartedPulling="2025-12-12 00:41:33.584912844 +0000 UTC m=+1084.130265710" lastFinishedPulling="2025-12-12 00:41:37.443419621 +0000 UTC m=+1087.988772487" observedRunningTime="2025-12-12 00:41:37.715234997 +0000 UTC m=+1088.260587903" watchObservedRunningTime="2025-12-12 00:41:37.721839164 +0000 UTC m=+1088.267192060" Dec 12 00:41:43 crc kubenswrapper[4606]: I1212 00:41:43.310487 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-686f4c6566-qf7w8" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.010741 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.011244 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.011305 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.011916 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98193dc190ed04d9478e682edb5e4363e657a585ee1347d2eb910b80fed16f3f"} 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.011968 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://98193dc190ed04d9478e682edb5e4363e657a585ee1347d2eb910b80fed16f3f" gracePeriod=600 Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.708766 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.710063 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.712986 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-cfx4x" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.717142 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.718063 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.722464 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-z92fz" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.740913 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.755843 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.762094 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.768554 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gckfj" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.771721 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.772627 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.795611 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-vbt8g" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.800536 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.805997 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4t6s\" (UniqueName: \"kubernetes.io/projected/a6f8bedd-5eb2-4092-abd9-34f8ccbed690-kube-api-access-h4t6s\") pod \"barbican-operator-controller-manager-7d9dfd778-97g95\" (UID: \"a6f8bedd-5eb2-4092-abd9-34f8ccbed690\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.806191 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46mx\" (UniqueName: \"kubernetes.io/projected/b5316be9-1796-4bf0-aabf-ac9cf01c709b-kube-api-access-f46mx\") pod \"cinder-operator-controller-manager-6c677c69b-9npjw\" (UID: \"b5316be9-1796-4bf0-aabf-ac9cf01c709b\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.823223 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.844005 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="98193dc190ed04d9478e682edb5e4363e657a585ee1347d2eb910b80fed16f3f" exitCode=0 Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.844049 4606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"98193dc190ed04d9478e682edb5e4363e657a585ee1347d2eb910b80fed16f3f"} Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.844078 4606 scope.go:117] "RemoveContainer" containerID="d38d273cf66284763f6d7e3888567975579d39fc9a7372cc2cae90d2dfe8ce04" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.850284 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.851435 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.860818 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-dr9lx" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.867072 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.879944 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.890569 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.891520 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.899904 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9w67s" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.903770 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.904734 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.908698 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jn5bn" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.908855 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.909482 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46mx\" (UniqueName: \"kubernetes.io/projected/b5316be9-1796-4bf0-aabf-ac9cf01c709b-kube-api-access-f46mx\") pod \"cinder-operator-controller-manager-6c677c69b-9npjw\" (UID: \"b5316be9-1796-4bf0-aabf-ac9cf01c709b\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.909518 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4t6s\" (UniqueName: \"kubernetes.io/projected/a6f8bedd-5eb2-4092-abd9-34f8ccbed690-kube-api-access-h4t6s\") pod \"barbican-operator-controller-manager-7d9dfd778-97g95\" (UID: \"a6f8bedd-5eb2-4092-abd9-34f8ccbed690\") " 
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.909671 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjvqk\" (UniqueName: \"kubernetes.io/projected/93b508cc-be40-4c34-a5ea-81b58893894e-kube-api-access-rjvqk\") pod \"designate-operator-controller-manager-697fb699cf-2c5hc\" (UID: \"93b508cc-be40-4c34-a5ea-81b58893894e\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.909715 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5ljj\" (UniqueName: \"kubernetes.io/projected/1c42899f-ae12-4c9b-b012-6ead724854cb-kube-api-access-j5ljj\") pod \"glance-operator-controller-manager-5697bb5779-mzf56\" (UID: \"1c42899f-ae12-4c9b-b012-6ead724854cb\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.916751 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.921152 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.936425 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9"] Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.965865 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46mx\" (UniqueName: \"kubernetes.io/projected/b5316be9-1796-4bf0-aabf-ac9cf01c709b-kube-api-access-f46mx\") pod \"cinder-operator-controller-manager-6c677c69b-9npjw\" (UID: \"b5316be9-1796-4bf0-aabf-ac9cf01c709b\") " 
pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.965913 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" Dec 12 00:42:02 crc kubenswrapper[4606]: I1212 00:42:02.975589 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hqx72" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:02.996763 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.039288 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjvqk\" (UniqueName: \"kubernetes.io/projected/93b508cc-be40-4c34-a5ea-81b58893894e-kube-api-access-rjvqk\") pod \"designate-operator-controller-manager-697fb699cf-2c5hc\" (UID: \"93b508cc-be40-4c34-a5ea-81b58893894e\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.040162 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ljj\" (UniqueName: \"kubernetes.io/projected/1c42899f-ae12-4c9b-b012-6ead724854cb-kube-api-access-j5ljj\") pod \"glance-operator-controller-manager-5697bb5779-mzf56\" (UID: \"1c42899f-ae12-4c9b-b012-6ead724854cb\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.040288 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " 
pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.040370 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlc9n\" (UniqueName: \"kubernetes.io/projected/9ec63351-044c-4c07-b021-a2835b2290c8-kube-api-access-hlc9n\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.040467 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsspd\" (UniqueName: \"kubernetes.io/projected/7663a2be-d4ba-43d4-bd35-7bf4b969a72d-kube-api-access-jsspd\") pod \"horizon-operator-controller-manager-68c6d99b8f-w5849\" (UID: \"7663a2be-d4ba-43d4-bd35-7bf4b969a72d\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.040553 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvktc\" (UniqueName: \"kubernetes.io/projected/1d9582d9-c931-4b43-8431-407d6c98cbc1-kube-api-access-cvktc\") pod \"heat-operator-controller-manager-5f64f6f8bb-n54mf\" (UID: \"1d9582d9-c931-4b43-8431-407d6c98cbc1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.041072 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4t6s\" (UniqueName: \"kubernetes.io/projected/a6f8bedd-5eb2-4092-abd9-34f8ccbed690-kube-api-access-h4t6s\") pod \"barbican-operator-controller-manager-7d9dfd778-97g95\" (UID: \"a6f8bedd-5eb2-4092-abd9-34f8ccbed690\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" Dec 12 00:42:03 crc 
kubenswrapper[4606]: I1212 00:42:03.052239 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.089793 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.090717 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.093765 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5ljj\" (UniqueName: \"kubernetes.io/projected/1c42899f-ae12-4c9b-b012-6ead724854cb-kube-api-access-j5ljj\") pod \"glance-operator-controller-manager-5697bb5779-mzf56\" (UID: \"1c42899f-ae12-4c9b-b012-6ead724854cb\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.094062 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hbcq2" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.095307 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjvqk\" (UniqueName: \"kubernetes.io/projected/93b508cc-be40-4c34-a5ea-81b58893894e-kube-api-access-rjvqk\") pod \"designate-operator-controller-manager-697fb699cf-2c5hc\" (UID: \"93b508cc-be40-4c34-a5ea-81b58893894e\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.098885 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.099802 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.101295 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.118122 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.131222 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-m5sjd" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.141315 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsspd\" (UniqueName: \"kubernetes.io/projected/7663a2be-d4ba-43d4-bd35-7bf4b969a72d-kube-api-access-jsspd\") pod \"horizon-operator-controller-manager-68c6d99b8f-w5849\" (UID: \"7663a2be-d4ba-43d4-bd35-7bf4b969a72d\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.141354 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvktc\" (UniqueName: \"kubernetes.io/projected/1d9582d9-c931-4b43-8431-407d6c98cbc1-kube-api-access-cvktc\") pod \"heat-operator-controller-manager-5f64f6f8bb-n54mf\" (UID: \"1d9582d9-c931-4b43-8431-407d6c98cbc1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.141403 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9c2d\" (UniqueName: \"kubernetes.io/projected/181d9f8e-1256-417e-ae8b-cc71d7fdc2b7-kube-api-access-v9c2d\") pod \"ironic-operator-controller-manager-967d97867-zgnm9\" (UID: \"181d9f8e-1256-417e-ae8b-cc71d7fdc2b7\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.141456 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.141478 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlc9n\" (UniqueName: \"kubernetes.io/projected/9ec63351-044c-4c07-b021-a2835b2290c8-kube-api-access-hlc9n\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:03 crc kubenswrapper[4606]: E1212 00:42:03.142015 4606 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:03 crc kubenswrapper[4606]: E1212 00:42:03.142062 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert podName:9ec63351-044c-4c07-b021-a2835b2290c8 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:03.642045516 +0000 UTC m=+1114.187398382 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert") pod "infra-operator-controller-manager-78d48bff9d-mnqs5" (UID: "9ec63351-044c-4c07-b021-a2835b2290c8") : secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.196313 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.215599 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.216548 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.218018 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsspd\" (UniqueName: \"kubernetes.io/projected/7663a2be-d4ba-43d4-bd35-7bf4b969a72d-kube-api-access-jsspd\") pod \"horizon-operator-controller-manager-68c6d99b8f-w5849\" (UID: \"7663a2be-d4ba-43d4-bd35-7bf4b969a72d\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.218392 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.246392 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdxqc\" (UniqueName: \"kubernetes.io/projected/6130b694-1b33-495f-b0af-481805aa4727-kube-api-access-wdxqc\") pod \"manila-operator-controller-manager-5b5fd79c9c-66jql\" (UID: \"6130b694-1b33-495f-b0af-481805aa4727\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.246430 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9c2d\" (UniqueName: \"kubernetes.io/projected/181d9f8e-1256-417e-ae8b-cc71d7fdc2b7-kube-api-access-v9c2d\") pod \"ironic-operator-controller-manager-967d97867-zgnm9\" (UID: \"181d9f8e-1256-417e-ae8b-cc71d7fdc2b7\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.246487 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49p7\" (UniqueName: \"kubernetes.io/projected/19a5895a-f008-411d-9ac2-6122eb52aa1e-kube-api-access-g49p7\") pod \"keystone-operator-controller-manager-7765d96ddf-bdj9z\" (UID: \"19a5895a-f008-411d-9ac2-6122eb52aa1e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.262500 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlc9n\" (UniqueName: \"kubernetes.io/projected/9ec63351-044c-4c07-b021-a2835b2290c8-kube-api-access-hlc9n\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:03 crc 
kubenswrapper[4606]: I1212 00:42:03.263000 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvktc\" (UniqueName: \"kubernetes.io/projected/1d9582d9-c931-4b43-8431-407d6c98cbc1-kube-api-access-cvktc\") pod \"heat-operator-controller-manager-5f64f6f8bb-n54mf\" (UID: \"1d9582d9-c931-4b43-8431-407d6c98cbc1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.274409 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.281651 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fzhwm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.303572 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.320387 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.321348 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.339738 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.340748 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-frx4m" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.350568 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdxqc\" (UniqueName: \"kubernetes.io/projected/6130b694-1b33-495f-b0af-481805aa4727-kube-api-access-wdxqc\") pod \"manila-operator-controller-manager-5b5fd79c9c-66jql\" (UID: \"6130b694-1b33-495f-b0af-481805aa4727\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.350929 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrhb\" (UniqueName: \"kubernetes.io/projected/7f8a5b5c-6158-4f24-8323-2afd6b9b2664-kube-api-access-7zrhb\") pod \"mariadb-operator-controller-manager-79c8c4686c-j44wr\" (UID: \"7f8a5b5c-6158-4f24-8323-2afd6b9b2664\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.350964 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49p7\" (UniqueName: \"kubernetes.io/projected/19a5895a-f008-411d-9ac2-6122eb52aa1e-kube-api-access-g49p7\") pod \"keystone-operator-controller-manager-7765d96ddf-bdj9z\" (UID: \"19a5895a-f008-411d-9ac2-6122eb52aa1e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.351749 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9c2d\" (UniqueName: \"kubernetes.io/projected/181d9f8e-1256-417e-ae8b-cc71d7fdc2b7-kube-api-access-v9c2d\") pod 
\"ironic-operator-controller-manager-967d97867-zgnm9\" (UID: \"181d9f8e-1256-417e-ae8b-cc71d7fdc2b7\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.356708 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.357751 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.361110 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-9vgrv" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.371711 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.372380 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.383246 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.384120 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.385466 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49p7\" (UniqueName: \"kubernetes.io/projected/19a5895a-f008-411d-9ac2-6122eb52aa1e-kube-api-access-g49p7\") pod \"keystone-operator-controller-manager-7765d96ddf-bdj9z\" (UID: \"19a5895a-f008-411d-9ac2-6122eb52aa1e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.387659 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-bjrq8" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.396481 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.411953 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdxqc\" (UniqueName: \"kubernetes.io/projected/6130b694-1b33-495f-b0af-481805aa4727-kube-api-access-wdxqc\") pod \"manila-operator-controller-manager-5b5fd79c9c-66jql\" (UID: \"6130b694-1b33-495f-b0af-481805aa4727\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.427850 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.453870 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrhb\" (UniqueName: \"kubernetes.io/projected/7f8a5b5c-6158-4f24-8323-2afd6b9b2664-kube-api-access-7zrhb\") pod \"mariadb-operator-controller-manager-79c8c4686c-j44wr\" (UID: \"7f8a5b5c-6158-4f24-8323-2afd6b9b2664\") " 
pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.453935 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85zqv\" (UniqueName: \"kubernetes.io/projected/1d4554d9-9dc1-4d74-b8ea-f4c886c08fde-kube-api-access-85zqv\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-h5tgw\" (UID: \"1d4554d9-9dc1-4d74-b8ea-f4c886c08fde\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.454015 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl2cj\" (UniqueName: \"kubernetes.io/projected/a6d74506-7048-4b2d-ba7f-46e83a508405-kube-api-access-jl2cj\") pod \"nova-operator-controller-manager-697bc559fc-wqj4k\" (UID: \"a6d74506-7048-4b2d-ba7f-46e83a508405\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.472717 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.473666 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.483367 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.499034 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bmrpv" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.499433 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.507155 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.508519 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.511971 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-82zsp" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.532641 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrhb\" (UniqueName: \"kubernetes.io/projected/7f8a5b5c-6158-4f24-8323-2afd6b9b2664-kube-api-access-7zrhb\") pod \"mariadb-operator-controller-manager-79c8c4686c-j44wr\" (UID: \"7f8a5b5c-6158-4f24-8323-2afd6b9b2664\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.533844 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.534884 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.537280 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qtqtp" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.557985 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86j47\" (UniqueName: \"kubernetes.io/projected/616771f5-4be8-4f22-86d8-dcd4a365a311-kube-api-access-86j47\") pod \"octavia-operator-controller-manager-998648c74-gtwmt\" (UID: \"616771f5-4be8-4f22-86d8-dcd4a365a311\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.558016 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85zqv\" (UniqueName: \"kubernetes.io/projected/1d4554d9-9dc1-4d74-b8ea-f4c886c08fde-kube-api-access-85zqv\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-h5tgw\" (UID: \"1d4554d9-9dc1-4d74-b8ea-f4c886c08fde\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.558062 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk6ld\" (UniqueName: \"kubernetes.io/projected/02def546-751a-46ac-848a-367f0a7f84cb-kube-api-access-pk6ld\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.558088 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.558128 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl2cj\" (UniqueName: \"kubernetes.io/projected/a6d74506-7048-4b2d-ba7f-46e83a508405-kube-api-access-jl2cj\") pod \"nova-operator-controller-manager-697bc559fc-wqj4k\" (UID: \"a6d74506-7048-4b2d-ba7f-46e83a508405\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.582450 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.587149 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.587607 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl2cj\" (UniqueName: \"kubernetes.io/projected/a6d74506-7048-4b2d-ba7f-46e83a508405-kube-api-access-jl2cj\") pod \"nova-operator-controller-manager-697bc559fc-wqj4k\" (UID: \"a6d74506-7048-4b2d-ba7f-46e83a508405\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.609812 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85zqv\" (UniqueName: \"kubernetes.io/projected/1d4554d9-9dc1-4d74-b8ea-f4c886c08fde-kube-api-access-85zqv\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-h5tgw\" (UID: \"1d4554d9-9dc1-4d74-b8ea-f4c886c08fde\") " 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.623347 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.624388 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.624606 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.634958 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-lkd9n" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.642274 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.643429 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.646384 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.648753 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bz9pt" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.666449 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.666497 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk6ld\" (UniqueName: \"kubernetes.io/projected/02def546-751a-46ac-848a-367f0a7f84cb-kube-api-access-pk6ld\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.666525 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.666550 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2ncs\" (UniqueName: \"kubernetes.io/projected/3b429293-caf6-47e1-9976-01d6fca19c6c-kube-api-access-c2ncs\") pod 
\"ovn-operator-controller-manager-b6456fdb6-hcqxk\" (UID: \"3b429293-caf6-47e1-9976-01d6fca19c6c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.666569 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftt5\" (UniqueName: \"kubernetes.io/projected/8d1093f3-e1d5-45be-9682-2f3ccf90eda2-kube-api-access-gftt5\") pod \"placement-operator-controller-manager-78f8948974-pq6pj\" (UID: \"8d1093f3-e1d5-45be-9682-2f3ccf90eda2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.666637 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86j47\" (UniqueName: \"kubernetes.io/projected/616771f5-4be8-4f22-86d8-dcd4a365a311-kube-api-access-86j47\") pod \"octavia-operator-controller-manager-998648c74-gtwmt\" (UID: \"616771f5-4be8-4f22-86d8-dcd4a365a311\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" Dec 12 00:42:03 crc kubenswrapper[4606]: E1212 00:42:03.666987 4606 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:03 crc kubenswrapper[4606]: E1212 00:42:03.667027 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert podName:9ec63351-044c-4c07-b021-a2835b2290c8 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:04.66701366 +0000 UTC m=+1115.212366516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert") pod "infra-operator-controller-manager-78d48bff9d-mnqs5" (UID: "9ec63351-044c-4c07-b021-a2835b2290c8") : secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:03 crc kubenswrapper[4606]: E1212 00:42:03.667372 4606 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:03 crc kubenswrapper[4606]: E1212 00:42:03.667398 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert podName:02def546-751a-46ac-848a-367f0a7f84cb nodeName:}" failed. No retries permitted until 2025-12-12 00:42:04.167389271 +0000 UTC m=+1114.712742137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fkjwr5" (UID: "02def546-751a-46ac-848a-367f0a7f84cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.682838 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.683263 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.695588 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86j47\" (UniqueName: \"kubernetes.io/projected/616771f5-4be8-4f22-86d8-dcd4a365a311-kube-api-access-86j47\") pod \"octavia-operator-controller-manager-998648c74-gtwmt\" (UID: \"616771f5-4be8-4f22-86d8-dcd4a365a311\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.754346 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.770420 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6md\" (UniqueName: \"kubernetes.io/projected/0b3e1e95-7581-4453-af8b-6a23e4bba5fe-kube-api-access-qk6md\") pod \"swift-operator-controller-manager-9d58d64bc-g4nxc\" (UID: \"0b3e1e95-7581-4453-af8b-6a23e4bba5fe\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.770489 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2ncs\" (UniqueName: \"kubernetes.io/projected/3b429293-caf6-47e1-9976-01d6fca19c6c-kube-api-access-c2ncs\") pod \"ovn-operator-controller-manager-b6456fdb6-hcqxk\" (UID: \"3b429293-caf6-47e1-9976-01d6fca19c6c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.770517 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gftt5\" (UniqueName: \"kubernetes.io/projected/8d1093f3-e1d5-45be-9682-2f3ccf90eda2-kube-api-access-gftt5\") pod 
\"placement-operator-controller-manager-78f8948974-pq6pj\" (UID: \"8d1093f3-e1d5-45be-9682-2f3ccf90eda2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.770561 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jlm8\" (UniqueName: \"kubernetes.io/projected/2a27185b-308d-419c-bc01-26714a1f0394-kube-api-access-2jlm8\") pod \"telemetry-operator-controller-manager-58d5ff84df-st6cm\" (UID: \"2a27185b-308d-419c-bc01-26714a1f0394\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.771422 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.772968 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.792060 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.803829 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk6ld\" (UniqueName: \"kubernetes.io/projected/02def546-751a-46ac-848a-367f0a7f84cb-kube-api-access-pk6ld\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.845930 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc"] Dec 12 00:42:03 crc 
kubenswrapper[4606]: I1212 00:42:03.862245 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"e80327b00df207db6b4792ec6ccf6cd67956fd25c801ee29c3af3ba674cbe9cc"} Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.872635 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jlm8\" (UniqueName: \"kubernetes.io/projected/2a27185b-308d-419c-bc01-26714a1f0394-kube-api-access-2jlm8\") pod \"telemetry-operator-controller-manager-58d5ff84df-st6cm\" (UID: \"2a27185b-308d-419c-bc01-26714a1f0394\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.872740 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6md\" (UniqueName: \"kubernetes.io/projected/0b3e1e95-7581-4453-af8b-6a23e4bba5fe-kube-api-access-qk6md\") pod \"swift-operator-controller-manager-9d58d64bc-g4nxc\" (UID: \"0b3e1e95-7581-4453-af8b-6a23e4bba5fe\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.886651 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.888013 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.898994 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftt5\" (UniqueName: \"kubernetes.io/projected/8d1093f3-e1d5-45be-9682-2f3ccf90eda2-kube-api-access-gftt5\") pod \"placement-operator-controller-manager-78f8948974-pq6pj\" (UID: \"8d1093f3-e1d5-45be-9682-2f3ccf90eda2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.899138 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8sgqh" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.899453 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2ncs\" (UniqueName: \"kubernetes.io/projected/3b429293-caf6-47e1-9976-01d6fca19c6c-kube-api-access-c2ncs\") pod \"ovn-operator-controller-manager-b6456fdb6-hcqxk\" (UID: \"3b429293-caf6-47e1-9976-01d6fca19c6c\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.915324 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.928289 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.929683 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.929812 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6md\" (UniqueName: \"kubernetes.io/projected/0b3e1e95-7581-4453-af8b-6a23e4bba5fe-kube-api-access-qk6md\") pod \"swift-operator-controller-manager-9d58d64bc-g4nxc\" (UID: \"0b3e1e95-7581-4453-af8b-6a23e4bba5fe\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.929886 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jlm8\" (UniqueName: \"kubernetes.io/projected/2a27185b-308d-419c-bc01-26714a1f0394-kube-api-access-2jlm8\") pod \"telemetry-operator-controller-manager-58d5ff84df-st6cm\" (UID: \"2a27185b-308d-419c-bc01-26714a1f0394\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.934453 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.941187 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-9smsp" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.958804 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.974559 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hshnf\" (UniqueName: \"kubernetes.io/projected/d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0-kube-api-access-hshnf\") pod \"test-operator-controller-manager-5854674fcc-qjxxm\" (UID: \"d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.990425 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g"] Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.991516 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.994698 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.994898 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tt78v" Dec 12 00:42:03 crc kubenswrapper[4606]: I1212 00:42:03.995269 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.000373 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g"] Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.003121 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.041970 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf"] Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.043894 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.052486 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf"] Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.055627 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw"] Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.071048 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-4p6zs" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.076743 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9lz9\" (UniqueName: \"kubernetes.io/projected/f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8-kube-api-access-f9lz9\") pod \"watcher-operator-controller-manager-75944c9b7-z2b8c\" (UID: \"f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.076861 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hshnf\" (UniqueName: \"kubernetes.io/projected/d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0-kube-api-access-hshnf\") pod \"test-operator-controller-manager-5854674fcc-qjxxm\" (UID: \"d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.076915 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: 
\"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.076944 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.076964 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kj57\" (UniqueName: \"kubernetes.io/projected/d681c7e6-bef3-4733-875c-45d6b60643e5-kube-api-access-8kj57\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.138624 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.145585 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hshnf\" (UniqueName: \"kubernetes.io/projected/d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0-kube-api-access-hshnf\") pod \"test-operator-controller-manager-5854674fcc-qjxxm\" (UID: \"d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.166671 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.180899 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9lz9\" (UniqueName: \"kubernetes.io/projected/f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8-kube-api-access-f9lz9\") pod \"watcher-operator-controller-manager-75944c9b7-z2b8c\" (UID: \"f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.180956 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.181011 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n9kk\" (UniqueName: \"kubernetes.io/projected/e1c99848-c685-4782-bb57-71217db4db6c-kube-api-access-6n9kk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6lzdf\" (UID: \"e1c99848-c685-4782-bb57-71217db4db6c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.181073 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.181113 
4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.181138 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kj57\" (UniqueName: \"kubernetes.io/projected/d681c7e6-bef3-4733-875c-45d6b60643e5-kube-api-access-8kj57\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.181324 4606 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.181380 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert podName:02def546-751a-46ac-848a-367f0a7f84cb nodeName:}" failed. No retries permitted until 2025-12-12 00:42:05.18136276 +0000 UTC m=+1115.726715696 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fkjwr5" (UID: "02def546-751a-46ac-848a-367f0a7f84cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.181494 4606 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.181539 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:04.681525424 +0000 UTC m=+1115.226878290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.181568 4606 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.181586 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:04.681580356 +0000 UTC m=+1115.226933222 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "metrics-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.229192 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kj57\" (UniqueName: \"kubernetes.io/projected/d681c7e6-bef3-4733-875c-45d6b60643e5-kube-api-access-8kj57\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.241242 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9lz9\" (UniqueName: \"kubernetes.io/projected/f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8-kube-api-access-f9lz9\") pod \"watcher-operator-controller-manager-75944c9b7-z2b8c\" (UID: \"f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.282667 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n9kk\" (UniqueName: \"kubernetes.io/projected/e1c99848-c685-4782-bb57-71217db4db6c-kube-api-access-6n9kk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6lzdf\" (UID: \"e1c99848-c685-4782-bb57-71217db4db6c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.343882 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n9kk\" (UniqueName: \"kubernetes.io/projected/e1c99848-c685-4782-bb57-71217db4db6c-kube-api-access-6n9kk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6lzdf\" (UID: 
\"e1c99848-c685-4782-bb57-71217db4db6c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.378096 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.426244 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.502564 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.711121 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.711202 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.711232 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: 
\"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.711388 4606 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.711447 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert podName:9ec63351-044c-4c07-b021-a2835b2290c8 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:06.711432892 +0000 UTC m=+1117.256785758 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert") pod "infra-operator-controller-manager-78d48bff9d-mnqs5" (UID: "9ec63351-044c-4c07-b021-a2835b2290c8") : secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.712509 4606 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.712619 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:05.712610253 +0000 UTC m=+1116.257963119 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "webhook-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.712738 4606 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: E1212 00:42:04.712843 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:05.712833819 +0000 UTC m=+1116.258186685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "metrics-server-cert" not found Dec 12 00:42:04 crc kubenswrapper[4606]: I1212 00:42:04.901260 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" event={"ID":"b5316be9-1796-4bf0-aabf-ac9cf01c709b","Type":"ContainerStarted","Data":"79190c44c3092693431eb0182fb043c6b686c7604e8a625c1e39907c2cd6c7f3"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.043868 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.079478 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.099883 4606 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56"] Dec 12 00:42:05 crc kubenswrapper[4606]: W1212 00:42:05.119236 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c42899f_ae12_4c9b_b012_6ead724854cb.slice/crio-7a6bc25e35c458ecf9abe460fa31c669c21ff65abbfff974c6e1986a797f6783 WatchSource:0}: Error finding container 7a6bc25e35c458ecf9abe460fa31c669c21ff65abbfff974c6e1986a797f6783: Status 404 returned error can't find the container with id 7a6bc25e35c458ecf9abe460fa31c669c21ff65abbfff974c6e1986a797f6783 Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.177356 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.226210 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.227983 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.228138 4606 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.228224 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert podName:02def546-751a-46ac-848a-367f0a7f84cb nodeName:}" failed. 
No retries permitted until 2025-12-12 00:42:07.228201476 +0000 UTC m=+1117.773554342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fkjwr5" (UID: "02def546-751a-46ac-848a-367f0a7f84cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.356417 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.397893 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.410274 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.426480 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.428263 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.554794 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.581889 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc"] Dec 12 00:42:05 crc kubenswrapper[4606]: W1212 00:42:05.602216 4606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b3e1e95_7581_4453_af8b_6a23e4bba5fe.slice/crio-bbd87635391d5c834d69db5c05b2ffa6481f563f29a757570c4f6e024cb4c66c WatchSource:0}: Error finding container bbd87635391d5c834d69db5c05b2ffa6481f563f29a757570c4f6e024cb4c66c: Status 404 returned error can't find the container with id bbd87635391d5c834d69db5c05b2ffa6481f563f29a757570c4f6e024cb4c66c Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.744739 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.744786 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.744898 4606 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.744942 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:07.744929768 +0000 UTC m=+1118.290282634 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "webhook-server-cert" not found Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.745320 4606 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.745359 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:07.74534921 +0000 UTC m=+1118.290702076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "metrics-server-cert" not found Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.800213 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm"] Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.801385 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c"] Dec 12 00:42:05 crc kubenswrapper[4606]: W1212 00:42:05.823348 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a27185b_308d_419c_bc01_26714a1f0394.slice/crio-6b79608c4369e482b7fe0edc2d6195edbfc42e1a1053e8769dd22231f451136c WatchSource:0}: Error finding container 6b79608c4369e482b7fe0edc2d6195edbfc42e1a1053e8769dd22231f451136c: Status 404 returned error can't find the 
container with id 6b79608c4369e482b7fe0edc2d6195edbfc42e1a1053e8769dd22231f451136c Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.838900 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jlm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-st6cm_openstack-operators(2a27185b-308d-419c-bc01-26714a1f0394): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.843122 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jlm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-st6cm_openstack-operators(2a27185b-308d-419c-bc01-26714a1f0394): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.844696 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" podUID="2a27185b-308d-419c-bc01-26714a1f0394" Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.934087 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" event={"ID":"0b3e1e95-7581-4453-af8b-6a23e4bba5fe","Type":"ContainerStarted","Data":"bbd87635391d5c834d69db5c05b2ffa6481f563f29a757570c4f6e024cb4c66c"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.936355 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" event={"ID":"1d9582d9-c931-4b43-8431-407d6c98cbc1","Type":"ContainerStarted","Data":"accd89e10f37ce704d2b9a13459cad2e9b5e2c67eccb249a621b768dc5d82a81"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.938427 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" event={"ID":"2a27185b-308d-419c-bc01-26714a1f0394","Type":"ContainerStarted","Data":"6b79608c4369e482b7fe0edc2d6195edbfc42e1a1053e8769dd22231f451136c"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.942639 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" event={"ID":"f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8","Type":"ContainerStarted","Data":"5e3da91e3dbdd3c74f0f6a50422cc6174e8f44d35d49ce34dd5b56df2f09e1ac"} Dec 12 00:42:05 crc kubenswrapper[4606]: E1212 00:42:05.943108 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" podUID="2a27185b-308d-419c-bc01-26714a1f0394" Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.947817 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" event={"ID":"1d4554d9-9dc1-4d74-b8ea-f4c886c08fde","Type":"ContainerStarted","Data":"458901d1d2236d65e9f091df330da32f6b3b10cf06afeb8caf969ad89d59d6ba"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.953806 4606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" event={"ID":"3b429293-caf6-47e1-9976-01d6fca19c6c","Type":"ContainerStarted","Data":"95586ef56572bf965bb176be442f87815523cae30dd17133aa4c18d0811fb5d2"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.974532 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" event={"ID":"616771f5-4be8-4f22-86d8-dcd4a365a311","Type":"ContainerStarted","Data":"3fa2cfd95150859ac4b78069fe53774970aa11a2a8c259154c907ffff415d203"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.981139 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" event={"ID":"6130b694-1b33-495f-b0af-481805aa4727","Type":"ContainerStarted","Data":"e0fd95eaa0ae0e61be232353b559f92bd5938393aedc955f56a99b6c2cbd1a62"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.987212 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" event={"ID":"93b508cc-be40-4c34-a5ea-81b58893894e","Type":"ContainerStarted","Data":"898934baf63d267cc879bfa781113af78ff02344189dbbc0485c0ec05725dd47"} Dec 12 00:42:05 crc kubenswrapper[4606]: I1212 00:42:05.994942 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm"] Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:05.997658 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" event={"ID":"7663a2be-d4ba-43d4-bd35-7bf4b969a72d","Type":"ContainerStarted","Data":"a35061b6e9d76cbc15c9e26d0917ddde2be2fc88a895a0f8cf9a0d0438435e58"} Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.007369 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" event={"ID":"1c42899f-ae12-4c9b-b012-6ead724854cb","Type":"ContainerStarted","Data":"7a6bc25e35c458ecf9abe460fa31c669c21ff65abbfff974c6e1986a797f6783"} Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.030902 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" event={"ID":"19a5895a-f008-411d-9ac2-6122eb52aa1e","Type":"ContainerStarted","Data":"876950ebd19020ad3c88d79c6648ee1f1c803ced4078f8dd03eb572b23c29b8a"} Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.037470 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" event={"ID":"a6f8bedd-5eb2-4092-abd9-34f8ccbed690","Type":"ContainerStarted","Data":"5aa4b9aa75b8093df62a61e717c2daba8ca67fbf6901a33c42f310cfc186998f"} Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.043052 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf"] Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.045423 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" event={"ID":"181d9f8e-1256-417e-ae8b-cc71d7fdc2b7","Type":"ContainerStarted","Data":"0e6bf5217b91b8236a42bf0ecd4f8a0cdf2383d560a0dc50a4b040f0cd8a6959"} Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.060802 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr"] Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.071686 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k"] Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.077718 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj"] Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.079892 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zrhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-j44wr_openstack-operators(7f8a5b5c-6158-4f24-8323-2afd6b9b2664): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.081999 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zrhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-j44wr_openstack-operators(7f8a5b5c-6158-4f24-8323-2afd6b9b2664): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.083756 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" podUID="7f8a5b5c-6158-4f24-8323-2afd6b9b2664" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.094759 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gftt5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-pq6pj_openstack-operators(8d1093f3-e1d5-45be-9682-2f3ccf90eda2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.096987 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gftt5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-pq6pj_openstack-operators(8d1093f3-e1d5-45be-9682-2f3ccf90eda2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.098260 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" podUID="8d1093f3-e1d5-45be-9682-2f3ccf90eda2" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.099249 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6n9kk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6lzdf_openstack-operators(e1c99848-c685-4782-bb57-71217db4db6c): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.100517 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" podUID="e1c99848-c685-4782-bb57-71217db4db6c" Dec 12 00:42:06 crc kubenswrapper[4606]: I1212 00:42:06.764003 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.764215 4606 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:06 crc kubenswrapper[4606]: E1212 00:42:06.764441 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert podName:9ec63351-044c-4c07-b021-a2835b2290c8 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:10.76442255 +0000 UTC m=+1121.309775416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert") pod "infra-operator-controller-manager-78d48bff9d-mnqs5" (UID: "9ec63351-044c-4c07-b021-a2835b2290c8") : secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:07 crc kubenswrapper[4606]: I1212 00:42:07.063501 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" event={"ID":"8d1093f3-e1d5-45be-9682-2f3ccf90eda2","Type":"ContainerStarted","Data":"dcbcf4484b910f0d2df377da40270c7a0c9cae2517828d218600708fe0637ab8"} Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.070651 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" podUID="8d1093f3-e1d5-45be-9682-2f3ccf90eda2" Dec 12 00:42:07 crc kubenswrapper[4606]: I1212 00:42:07.071144 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" event={"ID":"d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0","Type":"ContainerStarted","Data":"4de6d1c0838f712853d7d99773de3355234ca41adc6b8b46cad2658df78b7529"} Dec 12 00:42:07 crc kubenswrapper[4606]: I1212 00:42:07.077622 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" event={"ID":"a6d74506-7048-4b2d-ba7f-46e83a508405","Type":"ContainerStarted","Data":"2cceb346945c3f86e6edb25f3d18544f4cf64a8153f201a8fdee5e3baa66f75b"} Dec 12 00:42:07 crc 
kubenswrapper[4606]: I1212 00:42:07.089723 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" event={"ID":"e1c99848-c685-4782-bb57-71217db4db6c","Type":"ContainerStarted","Data":"908b59d374b842ea5789abc066d1eff9bb3447a04c4b58e57d4b7bd277d3098a"} Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.096384 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" podUID="e1c99848-c685-4782-bb57-71217db4db6c" Dec 12 00:42:07 crc kubenswrapper[4606]: I1212 00:42:07.100189 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" event={"ID":"7f8a5b5c-6158-4f24-8323-2afd6b9b2664","Type":"ContainerStarted","Data":"2d0bccfb94e629f891fe4aa45deb1c14be4add2f671514af7c29ad390e64519c"} Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.105002 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" podUID="7f8a5b5c-6158-4f24-8323-2afd6b9b2664" Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.105076 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" podUID="2a27185b-308d-419c-bc01-26714a1f0394" Dec 12 00:42:07 crc kubenswrapper[4606]: I1212 00:42:07.271021 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.271231 4606 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.271305 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert podName:02def546-751a-46ac-848a-367f0a7f84cb nodeName:}" failed. No retries permitted until 2025-12-12 00:42:11.271285718 +0000 UTC m=+1121.816638584 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fkjwr5" (UID: "02def546-751a-46ac-848a-367f0a7f84cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:07 crc kubenswrapper[4606]: I1212 00:42:07.777761 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.777874 4606 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.777954 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:11.777940641 +0000 UTC m=+1122.323293507 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "metrics-server-cert" not found Dec 12 00:42:07 crc kubenswrapper[4606]: I1212 00:42:07.778334 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.778415 4606 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 12 00:42:07 crc kubenswrapper[4606]: E1212 00:42:07.778447 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:11.778439774 +0000 UTC m=+1122.323792640 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "webhook-server-cert" not found Dec 12 00:42:08 crc kubenswrapper[4606]: E1212 00:42:08.123813 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" podUID="e1c99848-c685-4782-bb57-71217db4db6c" Dec 12 00:42:08 crc kubenswrapper[4606]: E1212 00:42:08.125750 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" podUID="8d1093f3-e1d5-45be-9682-2f3ccf90eda2" Dec 12 00:42:08 crc kubenswrapper[4606]: E1212 00:42:08.128380 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" podUID="7f8a5b5c-6158-4f24-8323-2afd6b9b2664" Dec 12 00:42:10 crc kubenswrapper[4606]: I1212 00:42:10.817426 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:10 crc kubenswrapper[4606]: E1212 00:42:10.817756 4606 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:10 crc kubenswrapper[4606]: E1212 00:42:10.818051 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert podName:9ec63351-044c-4c07-b021-a2835b2290c8 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:18.81802986 +0000 UTC m=+1129.363382736 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert") pod "infra-operator-controller-manager-78d48bff9d-mnqs5" (UID: "9ec63351-044c-4c07-b021-a2835b2290c8") : secret "infra-operator-webhook-server-cert" not found Dec 12 00:42:11 crc kubenswrapper[4606]: I1212 00:42:11.323410 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:11 crc kubenswrapper[4606]: E1212 00:42:11.323669 4606 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:11 crc kubenswrapper[4606]: E1212 00:42:11.323828 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert podName:02def546-751a-46ac-848a-367f0a7f84cb nodeName:}" failed. No retries permitted until 2025-12-12 00:42:19.323789009 +0000 UTC m=+1129.869141925 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fkjwr5" (UID: "02def546-751a-46ac-848a-367f0a7f84cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:11 crc kubenswrapper[4606]: I1212 00:42:11.830599 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:11 crc kubenswrapper[4606]: I1212 00:42:11.830752 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:11 crc kubenswrapper[4606]: E1212 00:42:11.830820 4606 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 12 00:42:11 crc kubenswrapper[4606]: E1212 00:42:11.830938 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:19.830906384 +0000 UTC m=+1130.376259290 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "metrics-server-cert" not found Dec 12 00:42:11 crc kubenswrapper[4606]: E1212 00:42:11.831215 4606 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 12 00:42:11 crc kubenswrapper[4606]: E1212 00:42:11.831303 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:19.831273034 +0000 UTC m=+1130.376625960 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "webhook-server-cert" not found Dec 12 00:42:18 crc kubenswrapper[4606]: I1212 00:42:18.833462 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:18 crc kubenswrapper[4606]: I1212 00:42:18.838827 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ec63351-044c-4c07-b021-a2835b2290c8-cert\") pod \"infra-operator-controller-manager-78d48bff9d-mnqs5\" (UID: \"9ec63351-044c-4c07-b021-a2835b2290c8\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:18 crc 
kubenswrapper[4606]: I1212 00:42:18.871018 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:19 crc kubenswrapper[4606]: I1212 00:42:19.341724 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:19 crc kubenswrapper[4606]: E1212 00:42:19.341920 4606 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:19 crc kubenswrapper[4606]: E1212 00:42:19.342273 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert podName:02def546-751a-46ac-848a-367f0a7f84cb nodeName:}" failed. No retries permitted until 2025-12-12 00:42:35.342250966 +0000 UTC m=+1145.887603842 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fkjwr5" (UID: "02def546-751a-46ac-848a-367f0a7f84cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 12 00:42:19 crc kubenswrapper[4606]: I1212 00:42:19.848821 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:19 crc kubenswrapper[4606]: I1212 00:42:19.848868 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:19 crc kubenswrapper[4606]: E1212 00:42:19.849003 4606 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 12 00:42:19 crc kubenswrapper[4606]: E1212 00:42:19.849004 4606 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 12 00:42:19 crc kubenswrapper[4606]: E1212 00:42:19.849048 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:35.849034752 +0000 UTC m=+1146.394387618 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "webhook-server-cert" not found Dec 12 00:42:19 crc kubenswrapper[4606]: E1212 00:42:19.849059 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs podName:d681c7e6-bef3-4733-875c-45d6b60643e5 nodeName:}" failed. No retries permitted until 2025-12-12 00:42:35.849054583 +0000 UTC m=+1146.394407449 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs") pod "openstack-operator-controller-manager-6f7f89d9c9-mfj2g" (UID: "d681c7e6-bef3-4733-875c-45d6b60643e5") : secret "metrics-server-cert" not found Dec 12 00:42:20 crc kubenswrapper[4606]: E1212 00:42:20.404989 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 12 00:42:20 crc kubenswrapper[4606]: E1212 00:42:20.405199 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hshnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-qjxxm_openstack-operators(d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:20 crc kubenswrapper[4606]: E1212 00:42:20.967534 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 12 00:42:20 crc kubenswrapper[4606]: E1212 00:42:20.967743 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-85zqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-h5tgw_openstack-operators(1d4554d9-9dc1-4d74-b8ea-f4c886c08fde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:21 crc kubenswrapper[4606]: E1212 00:42:21.664947 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 12 00:42:21 crc kubenswrapper[4606]: E1212 00:42:21.665120 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cvktc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-n54mf_openstack-operators(1d9582d9-c931-4b43-8431-407d6c98cbc1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:22 crc kubenswrapper[4606]: E1212 00:42:22.524841 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 12 00:42:22 crc kubenswrapper[4606]: E1212 00:42:22.525058 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-86j47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-gtwmt_openstack-operators(616771f5-4be8-4f22-86d8-dcd4a365a311): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:24 crc kubenswrapper[4606]: E1212 00:42:24.408739 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 12 00:42:24 crc kubenswrapper[4606]: E1212 00:42:24.409535 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qk6md,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-g4nxc_openstack-operators(0b3e1e95-7581-4453-af8b-6a23e4bba5fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:29 crc kubenswrapper[4606]: E1212 00:42:29.258691 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 12 00:42:29 crc kubenswrapper[4606]: E1212 00:42:29.259351 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rjvqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-2c5hc_openstack-operators(93b508cc-be40-4c34-a5ea-81b58893894e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:29 crc kubenswrapper[4606]: I1212 00:42:29.268223 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 00:42:29 crc kubenswrapper[4606]: E1212 00:42:29.936718 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a" Dec 12 00:42:29 crc kubenswrapper[4606]: E1212 00:42:29.937126 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f9lz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75944c9b7-z2b8c_openstack-operators(f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:31 crc kubenswrapper[4606]: E1212 00:42:31.455759 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 12 00:42:31 crc kubenswrapper[4606]: E1212 00:42:31.456102 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j5ljj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-mzf56_openstack-operators(1c42899f-ae12-4c9b-b012-6ead724854cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:32 crc kubenswrapper[4606]: E1212 00:42:32.004466 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 12 00:42:32 crc kubenswrapper[4606]: E1212 00:42:32.005012 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdxqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5b5fd79c9c-66jql_openstack-operators(6130b694-1b33-495f-b0af-481805aa4727): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:34 crc kubenswrapper[4606]: E1212 00:42:34.089710 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 12 00:42:34 crc kubenswrapper[4606]: E1212 00:42:34.090286 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c2ncs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-hcqxk_openstack-operators(3b429293-caf6-47e1-9976-01d6fca19c6c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:34 crc kubenswrapper[4606]: E1212 00:42:34.626329 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 12 00:42:34 crc kubenswrapper[4606]: E1212 00:42:34.626620 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g49p7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-bdj9z_openstack-operators(19a5895a-f008-411d-9ac2-6122eb52aa1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.399955 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.407791 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02def546-751a-46ac-848a-367f0a7f84cb-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fkjwr5\" (UID: \"02def546-751a-46ac-848a-367f0a7f84cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.621794 4606 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bmrpv" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.630364 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.909533 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.909670 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.914774 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-webhook-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" (UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.928792 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d681c7e6-bef3-4733-875c-45d6b60643e5-metrics-certs\") pod \"openstack-operator-controller-manager-6f7f89d9c9-mfj2g\" 
(UID: \"d681c7e6-bef3-4733-875c-45d6b60643e5\") " pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.977521 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tt78v" Dec 12 00:42:35 crc kubenswrapper[4606]: I1212 00:42:35.986112 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:36 crc kubenswrapper[4606]: E1212 00:42:36.239121 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 12 00:42:36 crc kubenswrapper[4606]: E1212 00:42:36.239520 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jl2cj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-wqj4k_openstack-operators(a6d74506-7048-4b2d-ba7f-46e83a508405): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:40 crc kubenswrapper[4606]: E1212 00:42:40.551633 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 12 00:42:40 crc kubenswrapper[4606]: E1212 00:42:40.552827 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zrhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-j44wr_openstack-operators(7f8a5b5c-6158-4f24-8323-2afd6b9b2664): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:41 crc kubenswrapper[4606]: E1212 00:42:41.151750 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 12 00:42:41 crc kubenswrapper[4606]: E1212 00:42:41.151932 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6n9kk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6lzdf_openstack-operators(e1c99848-c685-4782-bb57-71217db4db6c): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:42:41 crc kubenswrapper[4606]: E1212 00:42:41.153110 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" podUID="e1c99848-c685-4782-bb57-71217db4db6c" Dec 12 00:42:41 crc kubenswrapper[4606]: I1212 00:42:41.618580 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5"] Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.016623 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g"] Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.056546 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5"] Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.334968 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" event={"ID":"d681c7e6-bef3-4733-875c-45d6b60643e5","Type":"ContainerStarted","Data":"ad7f65cfcde5757ac5f26c9f3a2047a65238962ec24e35d9c6b703d6b7cb16bb"} Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.337553 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" event={"ID":"181d9f8e-1256-417e-ae8b-cc71d7fdc2b7","Type":"ContainerStarted","Data":"dd66ae5b135ae6c96fd88bb8968b519009ad89dc3857fef6c03a57e3fa39c2b3"} Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.339960 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" event={"ID":"02def546-751a-46ac-848a-367f0a7f84cb","Type":"ContainerStarted","Data":"c2946dcf92f6234b1723a0d7eeae016b76e52a5062c2410538918a17b414452f"} Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.342227 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" event={"ID":"8d1093f3-e1d5-45be-9682-2f3ccf90eda2","Type":"ContainerStarted","Data":"38110d110713f4ec1ef02c5cbcc7f86c2af550d941e96d83a03e2d9099b72ba3"} Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.344752 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" event={"ID":"9ec63351-044c-4c07-b021-a2835b2290c8","Type":"ContainerStarted","Data":"27b9cf331cdf19c101db89b1c6d435ca82cf5d1046b8c8bad4d8bd2fab358a5d"} Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.346748 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" event={"ID":"7663a2be-d4ba-43d4-bd35-7bf4b969a72d","Type":"ContainerStarted","Data":"8cc74121f63d92d36883c0a2991ac4fc0e97f6db7493765f14016ce4953abaa7"} Dec 12 00:42:42 crc kubenswrapper[4606]: I1212 00:42:42.359324 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" event={"ID":"a6f8bedd-5eb2-4092-abd9-34f8ccbed690","Type":"ContainerStarted","Data":"3c8f83f4d39a27da381e0134a54207dcf036dae48c36ad7f77f76cd823126e83"} Dec 12 00:42:43 crc kubenswrapper[4606]: I1212 00:42:43.404504 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" event={"ID":"b5316be9-1796-4bf0-aabf-ac9cf01c709b","Type":"ContainerStarted","Data":"d82fa00006908d5f5fb41cb617012d5b783e7211256e1b478e096f3da80b7cef"} Dec 12 
00:42:45 crc kubenswrapper[4606]: I1212 00:42:45.430655 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" event={"ID":"2a27185b-308d-419c-bc01-26714a1f0394","Type":"ContainerStarted","Data":"b99185df9a9d84176b8f609514916bc2e2a5769e15e8a0bace5abc52881fd240"} Dec 12 00:42:45 crc kubenswrapper[4606]: I1212 00:42:45.432907 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" event={"ID":"d681c7e6-bef3-4733-875c-45d6b60643e5","Type":"ContainerStarted","Data":"1d2e517bb4dceb76692b26fa6285ded2603e79612f61de2b5a3b3db99b3b4ddb"} Dec 12 00:42:45 crc kubenswrapper[4606]: I1212 00:42:45.433560 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:45 crc kubenswrapper[4606]: I1212 00:42:45.458587 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" podStartSLOduration=42.458567425 podStartE2EDuration="42.458567425s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:42:45.456068728 +0000 UTC m=+1156.001421634" watchObservedRunningTime="2025-12-12 00:42:45.458567425 +0000 UTC m=+1156.003920291" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.337975 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" podUID="93b508cc-be40-4c34-a5ea-81b58893894e" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.350943 4606 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" podUID="1d4554d9-9dc1-4d74-b8ea-f4c886c08fde" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.371254 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" podUID="616771f5-4be8-4f22-86d8-dcd4a365a311" Dec 12 00:42:46 crc kubenswrapper[4606]: I1212 00:42:46.442826 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" event={"ID":"93b508cc-be40-4c34-a5ea-81b58893894e","Type":"ContainerStarted","Data":"7181393694f90edd1cfb0c64c4c2903a53b94e30faedabf2cd9e84c4e60d2e56"} Dec 12 00:42:46 crc kubenswrapper[4606]: I1212 00:42:46.456588 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" event={"ID":"1d4554d9-9dc1-4d74-b8ea-f4c886c08fde","Type":"ContainerStarted","Data":"8fa6aed3c90d4a8e25d40c17349e18afb9ed91dc66488a8446001abad99fa032"} Dec 12 00:42:46 crc kubenswrapper[4606]: I1212 00:42:46.476254 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" event={"ID":"616771f5-4be8-4f22-86d8-dcd4a365a311","Type":"ContainerStarted","Data":"9dfcc950ae431034e4900ea633e4a00bbc17f3e2ef74bbe22710bbbe756bffda"} Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.514656 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" podUID="6130b694-1b33-495f-b0af-481805aa4727" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.731288 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" podUID="1c42899f-ae12-4c9b-b012-6ead724854cb" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.731785 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" podUID="0b3e1e95-7581-4453-af8b-6a23e4bba5fe" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.824052 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" podUID="f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.872791 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" podUID="3b429293-caf6-47e1-9976-01d6fca19c6c" Dec 12 00:42:46 crc kubenswrapper[4606]: E1212 00:42:46.939556 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" 
podUID="1d9582d9-c931-4b43-8431-407d6c98cbc1" Dec 12 00:42:47 crc kubenswrapper[4606]: E1212 00:42:47.045348 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" podUID="a6d74506-7048-4b2d-ba7f-46e83a508405" Dec 12 00:42:47 crc kubenswrapper[4606]: E1212 00:42:47.122547 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" podUID="19a5895a-f008-411d-9ac2-6122eb52aa1e" Dec 12 00:42:47 crc kubenswrapper[4606]: E1212 00:42:47.130382 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" podUID="d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.525577 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" event={"ID":"1c42899f-ae12-4c9b-b012-6ead724854cb","Type":"ContainerStarted","Data":"ab7b943ddd91065d9801070a565170270e03c04b7f20e7947fa724f576c2acdb"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.535496 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" event={"ID":"3b429293-caf6-47e1-9976-01d6fca19c6c","Type":"ContainerStarted","Data":"19355901358a3b3cd8d4f5f03998d7a4f1cdb405ff2176cdca43aa1d9ac0d1e4"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.571807 4606 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" event={"ID":"a6d74506-7048-4b2d-ba7f-46e83a508405","Type":"ContainerStarted","Data":"f7b998e3acf43b0e7d615039e6a1b5ea837aab26fc111b24b8617af90bcc3fe5"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.582997 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" event={"ID":"1d9582d9-c931-4b43-8431-407d6c98cbc1","Type":"ContainerStarted","Data":"a2cd2f9d9179c073a51d06a2f8e595ddf6ac56409014b2ded4e43715e7aeb084"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.610328 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" event={"ID":"2a27185b-308d-419c-bc01-26714a1f0394","Type":"ContainerStarted","Data":"d3af539a13a00830defed98da61bf27c2b9107f94454e75f72b23b30812b288c"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.610987 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.620046 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" event={"ID":"f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8","Type":"ContainerStarted","Data":"f7f7a697de1e25f9e9adb8f30f516f650b7c0e3856c5c5ad0265d790dcf958ed"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.633698 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.637352 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" 
event={"ID":"0b3e1e95-7581-4453-af8b-6a23e4bba5fe","Type":"ContainerStarted","Data":"93c12da27969364effdb58fbae809eeaec0cd85aa0bc8f007821259b9180e34b"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.647433 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.657619 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" event={"ID":"d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0","Type":"ContainerStarted","Data":"0e10f13196e24db7b433ccb04f5c66a6d924197aa7136d1ce3a76da01143182b"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.665911 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" event={"ID":"b5316be9-1796-4bf0-aabf-ac9cf01c709b","Type":"ContainerStarted","Data":"75cd567127ff635221bd161786e511cb5388ca881b5807860ca45d00cbb68048"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.667271 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.674880 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.695940 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" event={"ID":"19a5895a-f008-411d-9ac2-6122eb52aa1e","Type":"ContainerStarted","Data":"5d7c4f3eb1c81248745daec173dbd29a27beb82e4ca66abd1cae14b22dae13b9"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.697491 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" podStartSLOduration=3.689711192 podStartE2EDuration="44.697480369s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:06.094643624 +0000 UTC m=+1116.639996490" lastFinishedPulling="2025-12-12 00:42:47.102412801 +0000 UTC m=+1157.647765667" observedRunningTime="2025-12-12 00:42:47.696449641 +0000 UTC m=+1158.241802507" watchObservedRunningTime="2025-12-12 00:42:47.697480369 +0000 UTC m=+1158.242833225" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.730643 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" event={"ID":"181d9f8e-1256-417e-ae8b-cc71d7fdc2b7","Type":"ContainerStarted","Data":"0a5ff872f7a96a0b31c86425b8e8a27e25d364fef2e0fcc869e3d11a25682888"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.732581 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.742489 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-9npjw" podStartSLOduration=3.620881896 podStartE2EDuration="45.742467988s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:04.214022467 +0000 UTC m=+1114.759375333" lastFinishedPulling="2025-12-12 00:42:46.335608559 +0000 UTC m=+1156.880961425" observedRunningTime="2025-12-12 00:42:47.732845979 +0000 UTC m=+1158.278198845" watchObservedRunningTime="2025-12-12 00:42:47.742467988 +0000 UTC m=+1158.287820854" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.759993 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" 
event={"ID":"6130b694-1b33-495f-b0af-481805aa4727","Type":"ContainerStarted","Data":"c4a34ee838e2819d00a13f45692ecb45c7e02f5ed0ee1022bafeb9573a8e1d77"} Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.804573 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" Dec 12 00:42:47 crc kubenswrapper[4606]: I1212 00:42:47.813425 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" podStartSLOduration=3.838333256 podStartE2EDuration="44.813407574s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.83876513 +0000 UTC m=+1116.384117996" lastFinishedPulling="2025-12-12 00:42:46.813839448 +0000 UTC m=+1157.359192314" observedRunningTime="2025-12-12 00:42:47.810016103 +0000 UTC m=+1158.355368989" watchObservedRunningTime="2025-12-12 00:42:47.813407574 +0000 UTC m=+1158.358760440" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.089159 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-zgnm9" podStartSLOduration=4.652905805 podStartE2EDuration="46.089134492s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.47436068 +0000 UTC m=+1116.019713546" lastFinishedPulling="2025-12-12 00:42:46.910589367 +0000 UTC m=+1157.455942233" observedRunningTime="2025-12-12 00:42:48.066913375 +0000 UTC m=+1158.612266241" watchObservedRunningTime="2025-12-12 00:42:48.089134492 +0000 UTC m=+1158.634487358" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.779138 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" 
event={"ID":"616771f5-4be8-4f22-86d8-dcd4a365a311","Type":"ContainerStarted","Data":"0366727f5154d2e4b75318417202a1e05aa8ab7d46c9da881b0500fc59b78f2d"} Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.780465 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.788357 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" event={"ID":"93b508cc-be40-4c34-a5ea-81b58893894e","Type":"ContainerStarted","Data":"052b160c05682b5a87bbf90d1d89d8da27d63829ec0e1f85ca5131caa726d16c"} Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.788418 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.811419 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-pq6pj" event={"ID":"8d1093f3-e1d5-45be-9682-2f3ccf90eda2","Type":"ContainerStarted","Data":"625a2581152349a3e764de0507e2680d0c29aaf5b220d9b9d2b5a59b4c55662a"} Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.829914 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" event={"ID":"1d4554d9-9dc1-4d74-b8ea-f4c886c08fde","Type":"ContainerStarted","Data":"298efabb9336b5cf192fc188f170e1fbcff8a4b6579d89aaae5ef2b369281cb9"} Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.829950 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.833226 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" podStartSLOduration=4.128849091 podStartE2EDuration="45.833209753s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.593547722 +0000 UTC m=+1116.138900588" lastFinishedPulling="2025-12-12 00:42:47.297908384 +0000 UTC m=+1157.843261250" observedRunningTime="2025-12-12 00:42:48.815767524 +0000 UTC m=+1159.361120400" watchObservedRunningTime="2025-12-12 00:42:48.833209753 +0000 UTC m=+1159.378562619" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.852382 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" podStartSLOduration=4.622944969 podStartE2EDuration="46.852362427s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.05899139 +0000 UTC m=+1115.604344256" lastFinishedPulling="2025-12-12 00:42:47.288408848 +0000 UTC m=+1157.833761714" observedRunningTime="2025-12-12 00:42:48.848649748 +0000 UTC m=+1159.394002624" watchObservedRunningTime="2025-12-12 00:42:48.852362427 +0000 UTC m=+1159.397715293" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.870974 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" event={"ID":"7663a2be-d4ba-43d4-bd35-7bf4b969a72d","Type":"ContainerStarted","Data":"92f2df5ea5fa7f07b493d389d214e4ce4a4cdf8014c618716c9e55b8289d1759"} Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.871990 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.874154 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" Dec 12 00:42:48 crc 
kubenswrapper[4606]: I1212 00:42:48.892595 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" podStartSLOduration=4.062677944 podStartE2EDuration="45.892557057s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.4613235 +0000 UTC m=+1116.006676366" lastFinishedPulling="2025-12-12 00:42:47.291202613 +0000 UTC m=+1157.836555479" observedRunningTime="2025-12-12 00:42:48.88967698 +0000 UTC m=+1159.435029856" watchObservedRunningTime="2025-12-12 00:42:48.892557057 +0000 UTC m=+1159.437909913" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.909386 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" event={"ID":"a6f8bedd-5eb2-4092-abd9-34f8ccbed690","Type":"ContainerStarted","Data":"fea4580e9822a6a3c9a15255c4e5100fac6a13d18b75202a2eae6c9da48685a5"} Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.909636 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.913066 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" Dec 12 00:42:48 crc kubenswrapper[4606]: I1212 00:42:48.936350 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-w5849" podStartSLOduration=4.839685532 podStartE2EDuration="46.936329283s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.293583543 +0000 UTC m=+1115.838936409" lastFinishedPulling="2025-12-12 00:42:47.390227294 +0000 UTC m=+1157.935580160" observedRunningTime="2025-12-12 00:42:48.933099256 +0000 UTC m=+1159.478452122" 
watchObservedRunningTime="2025-12-12 00:42:48.936329283 +0000 UTC m=+1159.481682149" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.745106 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-97g95" podStartSLOduration=5.654853533 podStartE2EDuration="47.745093363s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.202645069 +0000 UTC m=+1115.747997935" lastFinishedPulling="2025-12-12 00:42:47.292884899 +0000 UTC m=+1157.838237765" observedRunningTime="2025-12-12 00:42:48.982583526 +0000 UTC m=+1159.527936392" watchObservedRunningTime="2025-12-12 00:42:49.745093363 +0000 UTC m=+1160.290446229" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.922653 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" event={"ID":"1d9582d9-c931-4b43-8431-407d6c98cbc1","Type":"ContainerStarted","Data":"4556579a6c7734b9990931d79fbeb0204e3e7ff7f850115acc9a8124edbd8cd0"} Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.923550 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.937567 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" event={"ID":"0b3e1e95-7581-4453-af8b-6a23e4bba5fe","Type":"ContainerStarted","Data":"e6aaecf1743e2845b451886853747d459f1e2493c906afa55154a987d80775c6"} Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.938066 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.939951 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" event={"ID":"d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0","Type":"ContainerStarted","Data":"9fda1628232b4bac92284349ed43c76198b5ee949d5c6f4ed270a15a6bb5536d"} Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.940333 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.958562 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" podStartSLOduration=4.694567403 podStartE2EDuration="47.958547478s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.084897086 +0000 UTC m=+1115.630249952" lastFinishedPulling="2025-12-12 00:42:48.348877161 +0000 UTC m=+1158.894230027" observedRunningTime="2025-12-12 00:42:49.957305074 +0000 UTC m=+1160.502657930" watchObservedRunningTime="2025-12-12 00:42:49.958547478 +0000 UTC m=+1160.503900344" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.964684 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" event={"ID":"a6d74506-7048-4b2d-ba7f-46e83a508405","Type":"ContainerStarted","Data":"55014d2e646b27ca7d8a864486a2b071da9cd313b26c3473ec40ad0ddbc7092f"} Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.965314 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.982480 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" event={"ID":"1c42899f-ae12-4c9b-b012-6ead724854cb","Type":"ContainerStarted","Data":"9962b4bcbe71bc623765616da2182bc2fc7eee99707e8292dd031637a8e229a3"} Dec 
12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.983028 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.985499 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" event={"ID":"6130b694-1b33-495f-b0af-481805aa4727","Type":"ContainerStarted","Data":"ef0c8e4aa3aa26d1a05f62c0012614747f62214bd3f1246be39ff76d92f3e72a"} Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.986153 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" Dec 12 00:42:49 crc kubenswrapper[4606]: I1212 00:42:49.996537 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" podStartSLOduration=4.219689541 podStartE2EDuration="46.996524848s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.606914051 +0000 UTC m=+1116.152266917" lastFinishedPulling="2025-12-12 00:42:48.383749358 +0000 UTC m=+1158.929102224" observedRunningTime="2025-12-12 00:42:49.994855613 +0000 UTC m=+1160.540208469" watchObservedRunningTime="2025-12-12 00:42:49.996524848 +0000 UTC m=+1160.541877714" Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.009677 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" event={"ID":"3b429293-caf6-47e1-9976-01d6fca19c6c","Type":"ContainerStarted","Data":"48f55055f452f06251313dc5fbfb4ffea08284e560e2fe60b32d904aea382121"} Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.009967 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" Dec 12 00:42:50 crc 
kubenswrapper[4606]: I1212 00:42:50.025312 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" event={"ID":"19a5895a-f008-411d-9ac2-6122eb52aa1e","Type":"ContainerStarted","Data":"5cf55fdc11b82d056a9e4dd85c1a0e45efa2dcf05a5fbf973edc403f2ccf618f"} Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.029921 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" podStartSLOduration=4.713124419 podStartE2EDuration="47.029901905s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:06.03081774 +0000 UTC m=+1116.576170606" lastFinishedPulling="2025-12-12 00:42:48.347595226 +0000 UTC m=+1158.892948092" observedRunningTime="2025-12-12 00:42:50.026975576 +0000 UTC m=+1160.572328452" watchObservedRunningTime="2025-12-12 00:42:50.029901905 +0000 UTC m=+1160.575254761" Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.032974 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-st6cm" Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.053148 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" podStartSLOduration=4.358978414 podStartE2EDuration="47.053126559s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.449131402 +0000 UTC m=+1115.994484268" lastFinishedPulling="2025-12-12 00:42:48.143279557 +0000 UTC m=+1158.688632413" observedRunningTime="2025-12-12 00:42:50.04458248 +0000 UTC m=+1160.589935356" watchObservedRunningTime="2025-12-12 00:42:50.053126559 +0000 UTC m=+1160.598479425" Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.085311 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" podStartSLOduration=5.040799915 podStartE2EDuration="48.085290323s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.12226758 +0000 UTC m=+1115.667620436" lastFinishedPulling="2025-12-12 00:42:48.166757978 +0000 UTC m=+1158.712110844" observedRunningTime="2025-12-12 00:42:50.083397652 +0000 UTC m=+1160.628750518" watchObservedRunningTime="2025-12-12 00:42:50.085290323 +0000 UTC m=+1160.630643179" Dec 12 00:42:50 crc kubenswrapper[4606]: E1212 00:42:50.099608 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" podUID="7f8a5b5c-6158-4f24-8323-2afd6b9b2664" Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.106614 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" podStartSLOduration=5.024745924 podStartE2EDuration="48.106599396s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.369973365 +0000 UTC m=+1115.915326231" lastFinishedPulling="2025-12-12 00:42:48.451826837 +0000 UTC m=+1158.997179703" observedRunningTime="2025-12-12 00:42:50.100339688 +0000 UTC m=+1160.645692554" watchObservedRunningTime="2025-12-12 00:42:50.106599396 +0000 UTC m=+1160.651952262" Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.125292 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" podStartSLOduration=4.846471371 podStartE2EDuration="47.125268787s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:06.070739922 +0000 UTC m=+1116.616092788" 
lastFinishedPulling="2025-12-12 00:42:48.349537338 +0000 UTC m=+1158.894890204" observedRunningTime="2025-12-12 00:42:50.120396056 +0000 UTC m=+1160.665748922" watchObservedRunningTime="2025-12-12 00:42:50.125268787 +0000 UTC m=+1160.670621653" Dec 12 00:42:50 crc kubenswrapper[4606]: I1212 00:42:50.139759 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" podStartSLOduration=5.092087123 podStartE2EDuration="48.139740306s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.46096886 +0000 UTC m=+1116.006321726" lastFinishedPulling="2025-12-12 00:42:48.508622043 +0000 UTC m=+1159.053974909" observedRunningTime="2025-12-12 00:42:50.136723235 +0000 UTC m=+1160.682076101" watchObservedRunningTime="2025-12-12 00:42:50.139740306 +0000 UTC m=+1160.685093172" Dec 12 00:42:51 crc kubenswrapper[4606]: I1212 00:42:51.038274 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" event={"ID":"f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8","Type":"ContainerStarted","Data":"c19d75398d0d0db3f050680427a0b756cede21113b82ffec6b87b1a0f110bad1"} Dec 12 00:42:51 crc kubenswrapper[4606]: I1212 00:42:51.039297 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" Dec 12 00:42:51 crc kubenswrapper[4606]: I1212 00:42:51.043503 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" event={"ID":"7f8a5b5c-6158-4f24-8323-2afd6b9b2664","Type":"ContainerStarted","Data":"55ff8dcff6a3833f58bea0100a41aafac2ee4a7b145473ec3ed986f512c8d111"} Dec 12 00:42:51 crc kubenswrapper[4606]: I1212 00:42:51.045225 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" Dec 12 00:42:51 crc kubenswrapper[4606]: E1212 00:42:51.052605 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" podUID="7f8a5b5c-6158-4f24-8323-2afd6b9b2664" Dec 12 00:42:51 crc kubenswrapper[4606]: I1212 00:42:51.084358 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" podStartSLOduration=4.099512034 podStartE2EDuration="48.084342896s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:05.822316258 +0000 UTC m=+1116.367669124" lastFinishedPulling="2025-12-12 00:42:49.80714712 +0000 UTC m=+1160.352499986" observedRunningTime="2025-12-12 00:42:51.064281046 +0000 UTC m=+1161.609633912" watchObservedRunningTime="2025-12-12 00:42:51.084342896 +0000 UTC m=+1161.629695762" Dec 12 00:42:53 crc kubenswrapper[4606]: I1212 00:42:53.059761 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" event={"ID":"9ec63351-044c-4c07-b021-a2835b2290c8","Type":"ContainerStarted","Data":"fa9da967b906ef1312eeee1cb4684302625f7671e70c04eadb55b4044ed4c29d"} Dec 12 00:42:53 crc kubenswrapper[4606]: I1212 00:42:53.061364 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" event={"ID":"02def546-751a-46ac-848a-367f0a7f84cb","Type":"ContainerStarted","Data":"f93936ecd28bd88b35fb276a395b43d0f17d344ba926c4af2e6c33f85e482a26"} Dec 12 00:42:53 crc kubenswrapper[4606]: I1212 00:42:53.103265 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-2c5hc" Dec 12 00:42:53 crc kubenswrapper[4606]: I1212 00:42:53.757035 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-h5tgw" Dec 12 00:42:53 crc kubenswrapper[4606]: I1212 00:42:53.776754 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-gtwmt" Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.006847 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-g4nxc" Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.069163 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" event={"ID":"9ec63351-044c-4c07-b021-a2835b2290c8","Type":"ContainerStarted","Data":"2ac053a11dedbb607c9b7b379f91579439a1dd2a5c7a73f101399dd6b87a20f4"} Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.069254 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.082546 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" event={"ID":"02def546-751a-46ac-848a-367f0a7f84cb","Type":"ContainerStarted","Data":"7928d75319df297acf016b4438a30331a86c411a1f20584b51c82a92043bbc88"} Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.082717 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.094740 4606 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" podStartSLOduration=41.321912828 podStartE2EDuration="52.094717126s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:41.96755334 +0000 UTC m=+1152.512906216" lastFinishedPulling="2025-12-12 00:42:52.740357648 +0000 UTC m=+1163.285710514" observedRunningTime="2025-12-12 00:42:54.088805538 +0000 UTC m=+1164.634158404" watchObservedRunningTime="2025-12-12 00:42:54.094717126 +0000 UTC m=+1164.640069992" Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.134254 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" podStartSLOduration=40.597933894 podStartE2EDuration="51.134235618s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:42.219681984 +0000 UTC m=+1152.765034850" lastFinishedPulling="2025-12-12 00:42:52.755983668 +0000 UTC m=+1163.301336574" observedRunningTime="2025-12-12 00:42:54.132893422 +0000 UTC m=+1164.678246288" watchObservedRunningTime="2025-12-12 00:42:54.134235618 +0000 UTC m=+1164.679588484" Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.144789 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hcqxk" Dec 12 00:42:54 crc kubenswrapper[4606]: I1212 00:42:54.381730 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qjxxm" Dec 12 00:42:54 crc kubenswrapper[4606]: E1212 00:42:54.700840 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" podUID="e1c99848-c685-4782-bb57-71217db4db6c" Dec 12 00:42:55 crc kubenswrapper[4606]: I1212 00:42:55.992546 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6f7f89d9c9-mfj2g" Dec 12 00:42:58 crc kubenswrapper[4606]: I1212 00:42:58.883089 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-mnqs5" Dec 12 00:43:03 crc kubenswrapper[4606]: I1212 00:43:03.120980 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-mzf56" Dec 12 00:43:03 crc kubenswrapper[4606]: I1212 00:43:03.504000 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-n54mf" Dec 12 00:43:03 crc kubenswrapper[4606]: I1212 00:43:03.586306 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-66jql" Dec 12 00:43:03 crc kubenswrapper[4606]: I1212 00:43:03.628853 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-bdj9z" Dec 12 00:43:03 crc kubenswrapper[4606]: I1212 00:43:03.691570 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wqj4k" Dec 12 00:43:04 crc kubenswrapper[4606]: I1212 00:43:04.430083 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-z2b8c" Dec 12 00:43:05 
crc kubenswrapper[4606]: I1212 00:43:05.637078 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fkjwr5" Dec 12 00:43:06 crc kubenswrapper[4606]: I1212 00:43:06.167665 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" event={"ID":"7f8a5b5c-6158-4f24-8323-2afd6b9b2664","Type":"ContainerStarted","Data":"0e374ae052bd4a6648b52089f5949b7429ec002d9980ba040a513270b0a9610c"} Dec 12 00:43:06 crc kubenswrapper[4606]: I1212 00:43:06.167881 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" Dec 12 00:43:07 crc kubenswrapper[4606]: I1212 00:43:07.179387 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" event={"ID":"e1c99848-c685-4782-bb57-71217db4db6c","Type":"ContainerStarted","Data":"2d758165e343f9976c0e0f862f8206cd31a5682be57cecda05a8e5e4061da4c2"} Dec 12 00:43:07 crc kubenswrapper[4606]: I1212 00:43:07.205353 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6lzdf" podStartSLOduration=4.168270819 podStartE2EDuration="1m4.205333384s" podCreationTimestamp="2025-12-12 00:42:03 +0000 UTC" firstStartedPulling="2025-12-12 00:42:06.099171816 +0000 UTC m=+1116.644524672" lastFinishedPulling="2025-12-12 00:43:06.136234361 +0000 UTC m=+1176.681587237" observedRunningTime="2025-12-12 00:43:07.203393092 +0000 UTC m=+1177.748745978" watchObservedRunningTime="2025-12-12 00:43:07.205333384 +0000 UTC m=+1177.750686250" Dec 12 00:43:07 crc kubenswrapper[4606]: I1212 00:43:07.205643 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" 
podStartSLOduration=5.984697825 podStartE2EDuration="1m5.205635672s" podCreationTimestamp="2025-12-12 00:42:02 +0000 UTC" firstStartedPulling="2025-12-12 00:42:06.079114527 +0000 UTC m=+1116.624467393" lastFinishedPulling="2025-12-12 00:43:05.300052374 +0000 UTC m=+1175.845405240" observedRunningTime="2025-12-12 00:43:06.201873014 +0000 UTC m=+1176.747225880" watchObservedRunningTime="2025-12-12 00:43:07.205635672 +0000 UTC m=+1177.750988538" Dec 12 00:43:13 crc kubenswrapper[4606]: I1212 00:43:13.649692 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-j44wr" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.491530 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-82vqv"] Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.493166 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.496419 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lhgft" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.496721 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.497194 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.497506 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.513139 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-82vqv"] Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.522273 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/28522b6b-a691-4d48-a1f6-d7dad1b11a58-config\") pod \"dnsmasq-dns-675f4bcbfc-82vqv\" (UID: \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.522398 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4nr\" (UniqueName: \"kubernetes.io/projected/28522b6b-a691-4d48-a1f6-d7dad1b11a58-kube-api-access-rk4nr\") pod \"dnsmasq-dns-675f4bcbfc-82vqv\" (UID: \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.595476 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mbzbq"] Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.596805 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.602090 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.607398 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mbzbq"] Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.625145 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-config\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.625243 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7b2\" (UniqueName: \"kubernetes.io/projected/a54f6780-951a-4e4b-953b-b470456b76f9-kube-api-access-gg7b2\") pod 
\"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.625275 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4nr\" (UniqueName: \"kubernetes.io/projected/28522b6b-a691-4d48-a1f6-d7dad1b11a58-kube-api-access-rk4nr\") pod \"dnsmasq-dns-675f4bcbfc-82vqv\" (UID: \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.625314 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.625348 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28522b6b-a691-4d48-a1f6-d7dad1b11a58-config\") pod \"dnsmasq-dns-675f4bcbfc-82vqv\" (UID: \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.626402 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28522b6b-a691-4d48-a1f6-d7dad1b11a58-config\") pod \"dnsmasq-dns-675f4bcbfc-82vqv\" (UID: \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.653983 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4nr\" (UniqueName: \"kubernetes.io/projected/28522b6b-a691-4d48-a1f6-d7dad1b11a58-kube-api-access-rk4nr\") pod \"dnsmasq-dns-675f4bcbfc-82vqv\" (UID: 
\"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.726083 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7b2\" (UniqueName: \"kubernetes.io/projected/a54f6780-951a-4e4b-953b-b470456b76f9-kube-api-access-gg7b2\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.726151 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.726228 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-config\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.726974 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.727047 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-config\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc 
kubenswrapper[4606]: I1212 00:43:34.762044 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7b2\" (UniqueName: \"kubernetes.io/projected/a54f6780-951a-4e4b-953b-b470456b76f9-kube-api-access-gg7b2\") pod \"dnsmasq-dns-78dd6ddcc-mbzbq\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.812902 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:43:34 crc kubenswrapper[4606]: I1212 00:43:34.913155 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:43:35 crc kubenswrapper[4606]: I1212 00:43:35.325643 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-82vqv"] Dec 12 00:43:35 crc kubenswrapper[4606]: I1212 00:43:35.386196 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mbzbq"] Dec 12 00:43:35 crc kubenswrapper[4606]: W1212 00:43:35.393500 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda54f6780_951a_4e4b_953b_b470456b76f9.slice/crio-7311652c144400795d901a6816fa9d6178f93fc73a87cce8b84dfe8d2c470443 WatchSource:0}: Error finding container 7311652c144400795d901a6816fa9d6178f93fc73a87cce8b84dfe8d2c470443: Status 404 returned error can't find the container with id 7311652c144400795d901a6816fa9d6178f93fc73a87cce8b84dfe8d2c470443 Dec 12 00:43:35 crc kubenswrapper[4606]: I1212 00:43:35.420166 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" event={"ID":"28522b6b-a691-4d48-a1f6-d7dad1b11a58","Type":"ContainerStarted","Data":"00d57dc96d17d1f47940d0682406601c1fa943fb51d4d057074f5f9c32521f5a"} Dec 12 00:43:35 crc kubenswrapper[4606]: I1212 00:43:35.422352 4606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" event={"ID":"a54f6780-951a-4e4b-953b-b470456b76f9","Type":"ContainerStarted","Data":"7311652c144400795d901a6816fa9d6178f93fc73a87cce8b84dfe8d2c470443"} Dec 12 00:43:35 crc kubenswrapper[4606]: I1212 00:43:35.948312 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-82vqv"] Dec 12 00:43:35 crc kubenswrapper[4606]: I1212 00:43:35.979053 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5p9bb"] Dec 12 00:43:35 crc kubenswrapper[4606]: I1212 00:43:35.980194 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.004816 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5p9bb"] Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.046276 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.046333 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-config\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.046419 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndh26\" (UniqueName: \"kubernetes.io/projected/398360c3-4b1c-4dc0-be54-511d4ac621ee-kube-api-access-ndh26\") pod 
\"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.147246 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndh26\" (UniqueName: \"kubernetes.io/projected/398360c3-4b1c-4dc0-be54-511d4ac621ee-kube-api-access-ndh26\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.147318 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.147349 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-config\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.148115 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-config\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.148306 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " 
pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.176502 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndh26\" (UniqueName: \"kubernetes.io/projected/398360c3-4b1c-4dc0-be54-511d4ac621ee-kube-api-access-ndh26\") pod \"dnsmasq-dns-666b6646f7-5p9bb\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.296379 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.355327 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mbzbq"] Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.380160 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dk28j"] Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.381788 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.395935 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dk28j"] Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.553904 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc7cv\" (UniqueName: \"kubernetes.io/projected/059cf819-e18e-493b-a65b-9f3f8b5d683f-kube-api-access-rc7cv\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.554055 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.554095 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-config\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.654861 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc7cv\" (UniqueName: \"kubernetes.io/projected/059cf819-e18e-493b-a65b-9f3f8b5d683f-kube-api-access-rc7cv\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.654973 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.655013 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-config\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.656522 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.657049 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-config\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.693818 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc7cv\" (UniqueName: \"kubernetes.io/projected/059cf819-e18e-493b-a65b-9f3f8b5d683f-kube-api-access-rc7cv\") pod \"dnsmasq-dns-57d769cc4f-dk28j\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.709357 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:43:36 crc kubenswrapper[4606]: I1212 00:43:36.922796 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5p9bb"] Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.175688 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.177085 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.182767 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.182996 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.183104 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.183739 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-666rg" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.183922 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.184021 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.184123 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.226746 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.282766 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-config-data\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.282856 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.282913 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283039 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283110 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e415e37-636f-4f5d-a64e-4dd815e6030e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283148 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-g8fnp\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-kube-api-access-g8fnp\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283203 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283229 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283371 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e415e37-636f-4f5d-a64e-4dd815e6030e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283533 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.283568 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.294448 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dk28j"] Dec 12 00:43:37 crc kubenswrapper[4606]: W1212 00:43:37.305978 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod059cf819_e18e_493b_a65b_9f3f8b5d683f.slice/crio-a3ed91ad4c0871f703d8ca2e1e25d3cadf059b463aa54ac42174a5298320faad WatchSource:0}: Error finding container a3ed91ad4c0871f703d8ca2e1e25d3cadf059b463aa54ac42174a5298320faad: Status 404 returned error can't find the container with id a3ed91ad4c0871f703d8ca2e1e25d3cadf059b463aa54ac42174a5298320faad Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384763 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e415e37-636f-4f5d-a64e-4dd815e6030e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384815 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384840 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: 
I1212 00:43:37.384877 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-config-data\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384905 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384928 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384960 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384981 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e415e37-636f-4f5d-a64e-4dd815e6030e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.384998 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8fnp\" (UniqueName: 
\"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-kube-api-access-g8fnp\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.385014 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.385029 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.385385 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.386563 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.387024 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.387945 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.388005 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-config-data\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.388138 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.391533 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.391814 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e415e37-636f-4f5d-a64e-4dd815e6030e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.394785 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e415e37-636f-4f5d-a64e-4dd815e6030e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.396447 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.412370 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8fnp\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-kube-api-access-g8fnp\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.417223 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.479663 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" event={"ID":"398360c3-4b1c-4dc0-be54-511d4ac621ee","Type":"ContainerStarted","Data":"65174a4b41c5b566cb9ca8e2a71ff42e51dc1909a4cd0b5f48ef814f809010fd"} Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.481124 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" event={"ID":"059cf819-e18e-493b-a65b-9f3f8b5d683f","Type":"ContainerStarted","Data":"a3ed91ad4c0871f703d8ca2e1e25d3cadf059b463aa54ac42174a5298320faad"} Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.518463 4606 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.548776 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.550329 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.553537 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xdfhc" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.553727 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.553854 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.553980 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.554201 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.554315 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.554428 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.558447 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696094 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696135 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd9fd090-7c43-44f4-9951-10b4528fc8a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696159 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696217 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcbhk\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-kube-api-access-qcbhk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696250 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd9fd090-7c43-44f4-9951-10b4528fc8a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696273 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696298 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696325 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696364 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696395 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.696442 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797623 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797695 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797746 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797835 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797866 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd9fd090-7c43-44f4-9951-10b4528fc8a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797914 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797951 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcbhk\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-kube-api-access-qcbhk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.797984 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd9fd090-7c43-44f4-9951-10b4528fc8a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.798020 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.798042 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 
00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.798070 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.798683 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.798697 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.798859 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.799350 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.803623 4606 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.821021 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd9fd090-7c43-44f4-9951-10b4528fc8a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.821555 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.822938 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.824784 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.837392 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd9fd090-7c43-44f4-9951-10b4528fc8a2-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.847083 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcbhk\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-kube-api-access-qcbhk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.851534 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:37 crc kubenswrapper[4606]: I1212 00:43:37.911464 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.241940 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.491263 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0e415e37-636f-4f5d-a64e-4dd815e6030e","Type":"ContainerStarted","Data":"46d6b3a8daa9ea589f99763355e721931535c6633a9bf7e7686bae3882f9f2d8"} Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.553592 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:43:38 crc kubenswrapper[4606]: W1212 00:43:38.560019 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd9fd090_7c43_44f4_9951_10b4528fc8a2.slice/crio-1c8f9b1263c4b5b0e26b095d054e0d7444112d6765d12422bba276e6ffdc26a4 WatchSource:0}: Error finding container 
1c8f9b1263c4b5b0e26b095d054e0d7444112d6765d12422bba276e6ffdc26a4: Status 404 returned error can't find the container with id 1c8f9b1263c4b5b0e26b095d054e0d7444112d6765d12422bba276e6ffdc26a4 Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.799590 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.801353 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.807295 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.807356 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.810878 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.812391 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.814012 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gxdxz" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.822432 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.927486 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.927560 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.927586 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csf4v\" (UniqueName: \"kubernetes.io/projected/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-kube-api-access-csf4v\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.927697 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-config-data-default\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.927767 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.927810 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.927839 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-kolla-config\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:38 crc kubenswrapper[4606]: I1212 00:43:38.928779 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.030545 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-kolla-config\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.030584 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.030626 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.030662 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.030686 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csf4v\" (UniqueName: \"kubernetes.io/projected/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-kube-api-access-csf4v\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.031104 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.031236 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.031253 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-config-data-default\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.031307 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.031340 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.031652 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-kolla-config\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.032689 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-config-data-default\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.037634 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.045657 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.050724 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.062912 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csf4v\" (UniqueName: \"kubernetes.io/projected/bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9-kube-api-access-csf4v\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.087516 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9\") " pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.135597 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.521083 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd9fd090-7c43-44f4-9951-10b4528fc8a2","Type":"ContainerStarted","Data":"1c8f9b1263c4b5b0e26b095d054e0d7444112d6765d12422bba276e6ffdc26a4"} Dec 12 00:43:39 crc kubenswrapper[4606]: I1212 00:43:39.766248 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 12 00:43:39 crc kubenswrapper[4606]: W1212 00:43:39.808226 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdf45e39_242c_4bf6_b3e5_7bbb9e0a72b9.slice/crio-13dee800488a0376d27c7aab1d586074b23263d6208b12206517877abc467fda WatchSource:0}: Error finding container 13dee800488a0376d27c7aab1d586074b23263d6208b12206517877abc467fda: Status 404 returned error can't find the container with id 13dee800488a0376d27c7aab1d586074b23263d6208b12206517877abc467fda Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.227208 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.231023 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.238021 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dp978" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.239134 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.239637 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.240461 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.245120 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.361760 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pswl4\" (UniqueName: \"kubernetes.io/projected/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-kube-api-access-pswl4\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.361903 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.361982 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.362031 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.362235 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.362286 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.362317 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.362425 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464432 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464495 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464525 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464577 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464678 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464706 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pswl4\" (UniqueName: \"kubernetes.io/projected/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-kube-api-access-pswl4\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464738 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.464766 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.465748 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.465819 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.465916 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.466235 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.476831 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.491846 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.494569 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " 
pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.496919 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.510751 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pswl4\" (UniqueName: \"kubernetes.io/projected/469d04e8-23ca-4aba-b3f1-0c4ad8da1562-kube-api-access-pswl4\") pod \"openstack-cell1-galera-0\" (UID: \"469d04e8-23ca-4aba-b3f1-0c4ad8da1562\") " pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.565080 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.583365 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9","Type":"ContainerStarted","Data":"13dee800488a0376d27c7aab1d586074b23263d6208b12206517877abc467fda"} Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.636835 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.641656 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.643976 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-t4vqd" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.655568 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.655821 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.678537 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.787210 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-kolla-config\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.787300 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg9m5\" (UniqueName: \"kubernetes.io/projected/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-kube-api-access-zg9m5\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.787324 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.787387 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.787457 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-config-data\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.890412 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.890560 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-config-data\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.890583 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-kolla-config\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.890630 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg9m5\" (UniqueName: \"kubernetes.io/projected/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-kube-api-access-zg9m5\") pod \"memcached-0\" (UID: 
\"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.890651 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.895250 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-config-data\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.897424 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-kolla-config\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.924498 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.924561 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:40 crc kubenswrapper[4606]: I1212 00:43:40.933099 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg9m5\" (UniqueName: 
\"kubernetes.io/projected/eb404bf7-b4ad-4fd2-aeae-fc44a6315e39-kube-api-access-zg9m5\") pod \"memcached-0\" (UID: \"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39\") " pod="openstack/memcached-0" Dec 12 00:43:41 crc kubenswrapper[4606]: I1212 00:43:41.070493 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 12 00:43:41 crc kubenswrapper[4606]: I1212 00:43:41.396958 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 12 00:43:41 crc kubenswrapper[4606]: I1212 00:43:41.605888 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"469d04e8-23ca-4aba-b3f1-0c4ad8da1562","Type":"ContainerStarted","Data":"5588c4d002ec0876c47e59909c63fe6500b03ae485ed8c04f6f7f0069af6f0ec"} Dec 12 00:43:41 crc kubenswrapper[4606]: I1212 00:43:41.797554 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 12 00:43:41 crc kubenswrapper[4606]: W1212 00:43:41.941798 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb404bf7_b4ad_4fd2_aeae_fc44a6315e39.slice/crio-4eba2a09f6f0f3fcaaf3538c55f3be850e06a275e9dc4f1c82dd1481cfc6c185 WatchSource:0}: Error finding container 4eba2a09f6f0f3fcaaf3538c55f3be850e06a275e9dc4f1c82dd1481cfc6c185: Status 404 returned error can't find the container with id 4eba2a09f6f0f3fcaaf3538c55f3be850e06a275e9dc4f1c82dd1481cfc6c185 Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.499379 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.500711 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.506003 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-42849" Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.533526 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.635657 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmd5n\" (UniqueName: \"kubernetes.io/projected/fe774fb2-c953-4fc2-8f6b-ec94268d6e7d-kube-api-access-xmd5n\") pod \"kube-state-metrics-0\" (UID: \"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d\") " pod="openstack/kube-state-metrics-0" Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.692531 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39","Type":"ContainerStarted","Data":"4eba2a09f6f0f3fcaaf3538c55f3be850e06a275e9dc4f1c82dd1481cfc6c185"} Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.736604 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmd5n\" (UniqueName: \"kubernetes.io/projected/fe774fb2-c953-4fc2-8f6b-ec94268d6e7d-kube-api-access-xmd5n\") pod \"kube-state-metrics-0\" (UID: \"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d\") " pod="openstack/kube-state-metrics-0" Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.761622 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmd5n\" (UniqueName: \"kubernetes.io/projected/fe774fb2-c953-4fc2-8f6b-ec94268d6e7d-kube-api-access-xmd5n\") pod \"kube-state-metrics-0\" (UID: \"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d\") " pod="openstack/kube-state-metrics-0" Dec 12 00:43:42 crc kubenswrapper[4606]: I1212 00:43:42.843676 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 12 00:43:43 crc kubenswrapper[4606]: I1212 00:43:43.411800 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 12 00:43:43 crc kubenswrapper[4606]: I1212 00:43:43.724489 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d","Type":"ContainerStarted","Data":"aa91c344846cd9f4512562ebc0594a7267f85faa5b548877caef0ee995c65cd7"} Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.823753 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-666ck"] Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.827084 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666ck" Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.830224 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fh8p9"] Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.831787 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.832362 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8j49w" Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.832557 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.832668 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.843440 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666ck"] Dec 12 00:43:45 crc kubenswrapper[4606]: I1212 00:43:45.858304 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fh8p9"] Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010223 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-etc-ovs\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010292 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqgz\" (UniqueName: \"kubernetes.io/projected/23cadcb5-094e-4dc3-af06-6f1186b6cb98-kube-api-access-9rqgz\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010318 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23cadcb5-094e-4dc3-af06-6f1186b6cb98-scripts\") pod 
\"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010332 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbcj5\" (UniqueName: \"kubernetes.io/projected/015ed993-f4fd-4928-a5ec-d13ad04b0105-kube-api-access-bbcj5\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010352 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/015ed993-f4fd-4928-a5ec-d13ad04b0105-ovn-controller-tls-certs\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010371 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/015ed993-f4fd-4928-a5ec-d13ad04b0105-scripts\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010400 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-log-ovn\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010425 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-run\") pod \"ovn-controller-666ck\" (UID: 
\"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010443 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-run-ovn\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010471 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-run\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010497 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015ed993-f4fd-4928-a5ec-d13ad04b0105-combined-ca-bundle\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010514 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-log\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.010534 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-lib\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " 
pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115230 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqgz\" (UniqueName: \"kubernetes.io/projected/23cadcb5-094e-4dc3-af06-6f1186b6cb98-kube-api-access-9rqgz\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115298 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23cadcb5-094e-4dc3-af06-6f1186b6cb98-scripts\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115352 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbcj5\" (UniqueName: \"kubernetes.io/projected/015ed993-f4fd-4928-a5ec-d13ad04b0105-kube-api-access-bbcj5\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115405 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/015ed993-f4fd-4928-a5ec-d13ad04b0105-ovn-controller-tls-certs\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115435 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/015ed993-f4fd-4928-a5ec-d13ad04b0105-scripts\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 
00:43:46.115494 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-log-ovn\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115576 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-run\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115594 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-run-ovn\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115637 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-run\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115680 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015ed993-f4fd-4928-a5ec-d13ad04b0105-combined-ca-bundle\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115704 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-log\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115729 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-lib\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.115829 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-etc-ovs\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.116422 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-etc-ovs\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.119244 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23cadcb5-094e-4dc3-af06-6f1186b6cb98-scripts\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.120517 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-run\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " 
pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.120643 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-log-ovn\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.120694 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-run\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.120784 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/015ed993-f4fd-4928-a5ec-d13ad04b0105-var-run-ovn\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.121045 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-log\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.121115 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/23cadcb5-094e-4dc3-af06-6f1186b6cb98-var-lib\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.122062 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/015ed993-f4fd-4928-a5ec-d13ad04b0105-scripts\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.127711 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/015ed993-f4fd-4928-a5ec-d13ad04b0105-ovn-controller-tls-certs\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.137111 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqgz\" (UniqueName: \"kubernetes.io/projected/23cadcb5-094e-4dc3-af06-6f1186b6cb98-kube-api-access-9rqgz\") pod \"ovn-controller-ovs-fh8p9\" (UID: \"23cadcb5-094e-4dc3-af06-6f1186b6cb98\") " pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.152636 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015ed993-f4fd-4928-a5ec-d13ad04b0105-combined-ca-bundle\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.153396 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbcj5\" (UniqueName: \"kubernetes.io/projected/015ed993-f4fd-4928-a5ec-d13ad04b0105-kube-api-access-bbcj5\") pod \"ovn-controller-666ck\" (UID: \"015ed993-f4fd-4928-a5ec-d13ad04b0105\") " pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.154806 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666ck" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.172378 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.722849 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.724699 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.731647 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.731743 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.733707 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.733834 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.734085 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-lqm26" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.734985 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.828607 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/867593e3-7035-4358-8583-0d2cb0878282-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.828971 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.829005 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867593e3-7035-4358-8583-0d2cb0878282-config\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.829023 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.829055 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/867593e3-7035-4358-8583-0d2cb0878282-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.829085 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdd7\" (UniqueName: \"kubernetes.io/projected/867593e3-7035-4358-8583-0d2cb0878282-kube-api-access-7pdd7\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.829309 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.829560 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930709 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930793 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930829 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/867593e3-7035-4358-8583-0d2cb0878282-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930867 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " 
pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930899 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867593e3-7035-4358-8583-0d2cb0878282-config\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930923 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930959 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/867593e3-7035-4358-8583-0d2cb0878282-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.930997 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdd7\" (UniqueName: \"kubernetes.io/projected/867593e3-7035-4358-8583-0d2cb0878282-kube-api-access-7pdd7\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.931153 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.932578 4606 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867593e3-7035-4358-8583-0d2cb0878282-config\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.932869 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/867593e3-7035-4358-8583-0d2cb0878282-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.933789 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/867593e3-7035-4358-8583-0d2cb0878282-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.938188 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.945410 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.946207 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdd7\" (UniqueName: \"kubernetes.io/projected/867593e3-7035-4358-8583-0d2cb0878282-kube-api-access-7pdd7\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " 
pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.954873 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/867593e3-7035-4358-8583-0d2cb0878282-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:46 crc kubenswrapper[4606]: I1212 00:43:46.970950 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"867593e3-7035-4358-8583-0d2cb0878282\") " pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:47 crc kubenswrapper[4606]: I1212 00:43:47.052660 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.328013 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.329269 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.336322 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dq99h" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.336387 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.337273 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.337280 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.351867 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482073 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c68b0331-671b-4ca9-9f19-260d6faeada7-config\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482143 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482290 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482315 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c68b0331-671b-4ca9-9f19-260d6faeada7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482343 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c68b0331-671b-4ca9-9f19-260d6faeada7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482375 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482395 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rpr4\" (UniqueName: \"kubernetes.io/projected/c68b0331-671b-4ca9-9f19-260d6faeada7-kube-api-access-4rpr4\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.482455 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc 
kubenswrapper[4606]: I1212 00:43:50.583839 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.583916 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rpr4\" (UniqueName: \"kubernetes.io/projected/c68b0331-671b-4ca9-9f19-260d6faeada7-kube-api-access-4rpr4\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.583936 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.583995 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c68b0331-671b-4ca9-9f19-260d6faeada7-config\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.584044 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.584119 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.584138 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c68b0331-671b-4ca9-9f19-260d6faeada7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.584193 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c68b0331-671b-4ca9-9f19-260d6faeada7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.586036 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c68b0331-671b-4ca9-9f19-260d6faeada7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.586403 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c68b0331-671b-4ca9-9f19-260d6faeada7-config\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.586597 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 
crc kubenswrapper[4606]: I1212 00:43:50.585762 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c68b0331-671b-4ca9-9f19-260d6faeada7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.591681 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.592038 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.593367 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c68b0331-671b-4ca9-9f19-260d6faeada7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.604704 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rpr4\" (UniqueName: \"kubernetes.io/projected/c68b0331-671b-4ca9-9f19-260d6faeada7-kube-api-access-4rpr4\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.614024 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c68b0331-671b-4ca9-9f19-260d6faeada7\") " pod="openstack/ovsdbserver-sb-0" Dec 12 00:43:50 crc kubenswrapper[4606]: I1212 00:43:50.651080 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 12 00:44:00 crc kubenswrapper[4606]: E1212 00:44:00.733890 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 12 00:44:00 crc kubenswrapper[4606]: E1212 00:44:00.734627 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g8fnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(0e415e37-636f-4f5d-a64e-4dd815e6030e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:44:00 crc 
kubenswrapper[4606]: E1212 00:44:00.735838 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" Dec 12 00:44:00 crc kubenswrapper[4606]: E1212 00:44:00.767439 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 12 00:44:00 crc kubenswrapper[4606]: E1212 00:44:00.767646 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qcbhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(bd9fd090-7c43-44f4-9951-10b4528fc8a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:44:00 crc 
kubenswrapper[4606]: E1212 00:44:00.768843 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" Dec 12 00:44:00 crc kubenswrapper[4606]: E1212 00:44:00.909367 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" Dec 12 00:44:00 crc kubenswrapper[4606]: E1212 00:44:00.911159 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" Dec 12 00:44:01 crc kubenswrapper[4606]: E1212 00:44:01.454009 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 12 00:44:01 crc kubenswrapper[4606]: E1212 00:44:01.454303 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n579h78h645h96h65fhc8h684h6h546h7bh576h59fh65bh67fh5fch5f7h674hcdh5ch79h79h66bh5fhdchdh54dh5fbh98h657hf5h5cdh5ccq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zg9m5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(eb404bf7-b4ad-4fd2-aeae-fc44a6315e39): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:44:01 crc kubenswrapper[4606]: E1212 00:44:01.455511 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="eb404bf7-b4ad-4fd2-aeae-fc44a6315e39" Dec 12 00:44:01 crc kubenswrapper[4606]: E1212 00:44:01.918972 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="eb404bf7-b4ad-4fd2-aeae-fc44a6315e39" Dec 12 00:44:02 crc kubenswrapper[4606]: I1212 00:44:02.010609 4606 patch_prober.go:28] interesting 
pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:44:02 crc kubenswrapper[4606]: I1212 00:44:02.010657 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.171765 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.172490 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gg7b2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-mbzbq_openstack(a54f6780-951a-4e4b-953b-b470456b76f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.173876 4606 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" podUID="a54f6780-951a-4e4b-953b-b470456b76f9" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.204149 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.204338 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rc7cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-dk28j_openstack(059cf819-e18e-493b-a65b-9f3f8b5d683f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.204878 4606 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.204948 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndh26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesy
stem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-5p9bb_openstack(398360c3-4b1c-4dc0-be54-511d4ac621ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.205955 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" podUID="059cf819-e18e-493b-a65b-9f3f8b5d683f" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.206047 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" podUID="398360c3-4b1c-4dc0-be54-511d4ac621ee" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.263543 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.263713 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rk4nr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-82vqv_openstack(28522b6b-a691-4d48-a1f6-d7dad1b11a58): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.265115 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" podUID="28522b6b-a691-4d48-a1f6-d7dad1b11a58" Dec 12 00:44:08 crc kubenswrapper[4606]: I1212 00:44:08.814922 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666ck"] Dec 12 00:44:08 crc kubenswrapper[4606]: I1212 00:44:08.913743 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fh8p9"] Dec 12 00:44:08 crc kubenswrapper[4606]: I1212 00:44:08.973458 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666ck" event={"ID":"015ed993-f4fd-4928-a5ec-d13ad04b0105","Type":"ContainerStarted","Data":"79b5ae742b2e43a158cc23115cbac94bd79b7f2e99196105fe927ec9947be4da"} Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.978688 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" podUID="398360c3-4b1c-4dc0-be54-511d4ac621ee" Dec 12 00:44:08 crc kubenswrapper[4606]: E1212 00:44:08.978787 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" podUID="059cf819-e18e-493b-a65b-9f3f8b5d683f" Dec 12 00:44:09 crc kubenswrapper[4606]: I1212 00:44:09.034917 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 12 00:44:09 crc kubenswrapper[4606]: I1212 00:44:09.573577 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 12 00:44:10 crc kubenswrapper[4606]: E1212 00:44:10.029842 4606 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 12 00:44:10 crc kubenswrapper[4606]: E1212 00:44:10.029891 4606 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 12 00:44:10 crc kubenswrapper[4606]: E1212 00:44:10.030404 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xmd5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(fe774fb2-c953-4fc2-8f6b-ec94268d6e7d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 12 00:44:10 crc kubenswrapper[4606]: E1212 00:44:10.032139 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.259491 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.287270 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.358485 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-dns-svc\") pod \"a54f6780-951a-4e4b-953b-b470456b76f9\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.358613 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-config\") pod \"a54f6780-951a-4e4b-953b-b470456b76f9\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.358656 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28522b6b-a691-4d48-a1f6-d7dad1b11a58-config\") pod \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\" (UID: \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.358676 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4nr\" (UniqueName: \"kubernetes.io/projected/28522b6b-a691-4d48-a1f6-d7dad1b11a58-kube-api-access-rk4nr\") pod \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\" (UID: \"28522b6b-a691-4d48-a1f6-d7dad1b11a58\") " Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.358697 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg7b2\" (UniqueName: \"kubernetes.io/projected/a54f6780-951a-4e4b-953b-b470456b76f9-kube-api-access-gg7b2\") pod \"a54f6780-951a-4e4b-953b-b470456b76f9\" (UID: \"a54f6780-951a-4e4b-953b-b470456b76f9\") " Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.358878 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a54f6780-951a-4e4b-953b-b470456b76f9" (UID: "a54f6780-951a-4e4b-953b-b470456b76f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.358923 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-config" (OuterVolumeSpecName: "config") pod "a54f6780-951a-4e4b-953b-b470456b76f9" (UID: "a54f6780-951a-4e4b-953b-b470456b76f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.359313 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.359334 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54f6780-951a-4e4b-953b-b470456b76f9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.359422 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28522b6b-a691-4d48-a1f6-d7dad1b11a58-config" (OuterVolumeSpecName: "config") pod "28522b6b-a691-4d48-a1f6-d7dad1b11a58" (UID: "28522b6b-a691-4d48-a1f6-d7dad1b11a58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.366159 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28522b6b-a691-4d48-a1f6-d7dad1b11a58-kube-api-access-rk4nr" (OuterVolumeSpecName: "kube-api-access-rk4nr") pod "28522b6b-a691-4d48-a1f6-d7dad1b11a58" (UID: "28522b6b-a691-4d48-a1f6-d7dad1b11a58"). 
InnerVolumeSpecName "kube-api-access-rk4nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.373574 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54f6780-951a-4e4b-953b-b470456b76f9-kube-api-access-gg7b2" (OuterVolumeSpecName: "kube-api-access-gg7b2") pod "a54f6780-951a-4e4b-953b-b470456b76f9" (UID: "a54f6780-951a-4e4b-953b-b470456b76f9"). InnerVolumeSpecName "kube-api-access-gg7b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.467669 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28522b6b-a691-4d48-a1f6-d7dad1b11a58-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.467702 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4nr\" (UniqueName: \"kubernetes.io/projected/28522b6b-a691-4d48-a1f6-d7dad1b11a58-kube-api-access-rk4nr\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.467712 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg7b2\" (UniqueName: \"kubernetes.io/projected/a54f6780-951a-4e4b-953b-b470456b76f9-kube-api-access-gg7b2\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.992031 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"867593e3-7035-4358-8583-0d2cb0878282","Type":"ContainerStarted","Data":"9051de7a6074d8a5ba00142361a840639892edc112977c4edbcec7c9582dc915"} Dec 12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.995711 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"469d04e8-23ca-4aba-b3f1-0c4ad8da1562","Type":"ContainerStarted","Data":"2923a2a5a70001a050bf5fd88869b7e87edbeb5eb60f161a7e7d71dd19066c78"} Dec 
12 00:44:10 crc kubenswrapper[4606]: I1212 00:44:10.997741 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c68b0331-671b-4ca9-9f19-260d6faeada7","Type":"ContainerStarted","Data":"b96b5b3d8bfd10e11876c4c520fbb242190724a63ed1dfe9830fbf798c937bc5"} Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.010334 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" event={"ID":"28522b6b-a691-4d48-a1f6-d7dad1b11a58","Type":"ContainerDied","Data":"00d57dc96d17d1f47940d0682406601c1fa943fb51d4d057074f5f9c32521f5a"} Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.010420 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-82vqv" Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.022543 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9","Type":"ContainerStarted","Data":"0f1d6a62607ae2194750b69ce73d6f98b6e34dce9c09e4c840264f8363dfbdc7"} Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.024837 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" event={"ID":"a54f6780-951a-4e4b-953b-b470456b76f9","Type":"ContainerDied","Data":"7311652c144400795d901a6816fa9d6178f93fc73a87cce8b84dfe8d2c470443"} Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.024906 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mbzbq" Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.027245 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fh8p9" event={"ID":"23cadcb5-094e-4dc3-af06-6f1186b6cb98","Type":"ContainerStarted","Data":"19ec5583ea63a95f2e9c8e9acf14e27c260aa2c9e1e959f42b92958a5df94ba4"} Dec 12 00:44:11 crc kubenswrapper[4606]: E1212 00:44:11.029640 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.135685 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mbzbq"] Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.135735 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mbzbq"] Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.173452 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-82vqv"] Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.188696 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-82vqv"] Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.730038 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28522b6b-a691-4d48-a1f6-d7dad1b11a58" path="/var/lib/kubelet/pods/28522b6b-a691-4d48-a1f6-d7dad1b11a58/volumes" Dec 12 00:44:11 crc kubenswrapper[4606]: I1212 00:44:11.731209 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54f6780-951a-4e4b-953b-b470456b76f9" path="/var/lib/kubelet/pods/a54f6780-951a-4e4b-953b-b470456b76f9/volumes" Dec 12 00:44:14 crc kubenswrapper[4606]: I1212 00:44:14.054629 4606 
generic.go:334] "Generic (PLEG): container finished" podID="bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9" containerID="0f1d6a62607ae2194750b69ce73d6f98b6e34dce9c09e4c840264f8363dfbdc7" exitCode=0 Dec 12 00:44:14 crc kubenswrapper[4606]: I1212 00:44:14.054848 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9","Type":"ContainerDied","Data":"0f1d6a62607ae2194750b69ce73d6f98b6e34dce9c09e4c840264f8363dfbdc7"} Dec 12 00:44:14 crc kubenswrapper[4606]: I1212 00:44:14.064471 4606 generic.go:334] "Generic (PLEG): container finished" podID="469d04e8-23ca-4aba-b3f1-0c4ad8da1562" containerID="2923a2a5a70001a050bf5fd88869b7e87edbeb5eb60f161a7e7d71dd19066c78" exitCode=0 Dec 12 00:44:14 crc kubenswrapper[4606]: I1212 00:44:14.064512 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"469d04e8-23ca-4aba-b3f1-0c4ad8da1562","Type":"ContainerDied","Data":"2923a2a5a70001a050bf5fd88869b7e87edbeb5eb60f161a7e7d71dd19066c78"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.078757 4606 generic.go:334] "Generic (PLEG): container finished" podID="23cadcb5-094e-4dc3-af06-6f1186b6cb98" containerID="6c566cc68f957b4bf9e7f2904380a1e2a7bbaec229819cc5c4fef40a391c91f8" exitCode=0 Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.078846 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fh8p9" event={"ID":"23cadcb5-094e-4dc3-af06-6f1186b6cb98","Type":"ContainerDied","Data":"6c566cc68f957b4bf9e7f2904380a1e2a7bbaec229819cc5c4fef40a391c91f8"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.080799 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666ck" event={"ID":"015ed993-f4fd-4928-a5ec-d13ad04b0105","Type":"ContainerStarted","Data":"cb6cf295fcfd9fbdb6d2cb7c8fce2e41da398f9dd2e55d8cf012f4394ee59e6d"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.081583 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-666ck" Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.083122 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"eb404bf7-b4ad-4fd2-aeae-fc44a6315e39","Type":"ContainerStarted","Data":"ed742290ef24e05e1b71c741c1061718dcfdf0590120edaaf354111b9f1667b8"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.083395 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.086158 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"867593e3-7035-4358-8583-0d2cb0878282","Type":"ContainerStarted","Data":"0ef7cb6e7b6bc8dd392a8b1cbb1a85906626b058214e0d66cb3f234ef49ff1dd"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.089900 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd9fd090-7c43-44f4-9951-10b4528fc8a2","Type":"ContainerStarted","Data":"0de71c822d771a40ccdc61ecaaab12bca9931df22b2c4c086696c1a0a0173f7d"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.096633 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"469d04e8-23ca-4aba-b3f1-0c4ad8da1562","Type":"ContainerStarted","Data":"fe4eec6439cb7b8298d1c0ed19752d2dceea7f600af32ae83f0dfd429b1c2096"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.101816 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c68b0331-671b-4ca9-9f19-260d6faeada7","Type":"ContainerStarted","Data":"2c548d0c988fefe1a6de01814afcfa06da6d36eff4faffeb8629e82b8fd3b388"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.104857 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9","Type":"ContainerStarted","Data":"4f25a030c06944d0f871d9701e8e1f970bfc02ad22ab74312e5eb2f738bf84a7"} Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.182033 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.485505173 podStartE2EDuration="36.182008518s" podCreationTimestamp="2025-12-12 00:43:39 +0000 UTC" firstStartedPulling="2025-12-12 00:43:41.424908068 +0000 UTC m=+1211.970260934" lastFinishedPulling="2025-12-12 00:44:08.121411413 +0000 UTC m=+1238.666764279" observedRunningTime="2025-12-12 00:44:15.134605906 +0000 UTC m=+1245.679958772" watchObservedRunningTime="2025-12-12 00:44:15.182008518 +0000 UTC m=+1245.727361384" Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.207450 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.667119332 podStartE2EDuration="35.207427181s" podCreationTimestamp="2025-12-12 00:43:40 +0000 UTC" firstStartedPulling="2025-12-12 00:43:41.987508374 +0000 UTC m=+1212.532861240" lastFinishedPulling="2025-12-12 00:44:14.527816223 +0000 UTC m=+1245.073169089" observedRunningTime="2025-12-12 00:44:15.202680343 +0000 UTC m=+1245.748033209" watchObservedRunningTime="2025-12-12 00:44:15.207427181 +0000 UTC m=+1245.752780047" Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.225033 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-666ck" podStartSLOduration=25.111021237 podStartE2EDuration="30.225007262s" podCreationTimestamp="2025-12-12 00:43:45 +0000 UTC" firstStartedPulling="2025-12-12 00:44:08.889782434 +0000 UTC m=+1239.435135300" lastFinishedPulling="2025-12-12 00:44:14.003768449 +0000 UTC m=+1244.549121325" observedRunningTime="2025-12-12 00:44:15.216303929 +0000 UTC m=+1245.761656795" watchObservedRunningTime="2025-12-12 00:44:15.225007262 +0000 UTC 
m=+1245.770360128" Dec 12 00:44:15 crc kubenswrapper[4606]: I1212 00:44:15.245807 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.429404435 podStartE2EDuration="38.24578953s" podCreationTimestamp="2025-12-12 00:43:37 +0000 UTC" firstStartedPulling="2025-12-12 00:43:39.824695984 +0000 UTC m=+1210.370048850" lastFinishedPulling="2025-12-12 00:44:08.641081079 +0000 UTC m=+1239.186433945" observedRunningTime="2025-12-12 00:44:15.233345436 +0000 UTC m=+1245.778698302" watchObservedRunningTime="2025-12-12 00:44:15.24578953 +0000 UTC m=+1245.791142396" Dec 12 00:44:16 crc kubenswrapper[4606]: I1212 00:44:16.122101 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fh8p9" event={"ID":"23cadcb5-094e-4dc3-af06-6f1186b6cb98","Type":"ContainerStarted","Data":"1888fe938b04117e675290b44d2bab0bb9b15439780ea567725e14a6eef05ffb"} Dec 12 00:44:16 crc kubenswrapper[4606]: I1212 00:44:16.122411 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fh8p9" event={"ID":"23cadcb5-094e-4dc3-af06-6f1186b6cb98","Type":"ContainerStarted","Data":"9b5a2bc5848d881cb631ef0d94ce37671677fc889838976c1b40b2c9f4f28459"} Dec 12 00:44:16 crc kubenswrapper[4606]: I1212 00:44:16.122529 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:44:16 crc kubenswrapper[4606]: I1212 00:44:16.127933 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0e415e37-636f-4f5d-a64e-4dd815e6030e","Type":"ContainerStarted","Data":"8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c"} Dec 12 00:44:16 crc kubenswrapper[4606]: I1212 00:44:16.143987 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fh8p9" podStartSLOduration=27.168010112 podStartE2EDuration="31.143962865s" 
podCreationTimestamp="2025-12-12 00:43:45 +0000 UTC" firstStartedPulling="2025-12-12 00:44:10.026552712 +0000 UTC m=+1240.571905578" lastFinishedPulling="2025-12-12 00:44:14.002505465 +0000 UTC m=+1244.547858331" observedRunningTime="2025-12-12 00:44:16.138491948 +0000 UTC m=+1246.683844814" watchObservedRunningTime="2025-12-12 00:44:16.143962865 +0000 UTC m=+1246.689315731" Dec 12 00:44:16 crc kubenswrapper[4606]: I1212 00:44:16.173574 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:44:19 crc kubenswrapper[4606]: I1212 00:44:19.136435 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 12 00:44:19 crc kubenswrapper[4606]: I1212 00:44:19.137502 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 12 00:44:19 crc kubenswrapper[4606]: I1212 00:44:19.165646 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"867593e3-7035-4358-8583-0d2cb0878282","Type":"ContainerStarted","Data":"2e848804ef0dafc77e25d443efa28a092fc3ab955d04f97bbf5978b13e4311da"} Dec 12 00:44:19 crc kubenswrapper[4606]: I1212 00:44:19.169532 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c68b0331-671b-4ca9-9f19-260d6faeada7","Type":"ContainerStarted","Data":"86ccd53d5dacbf5ff09a9f7e543976c4f56dd42b1f6b89e8af5a837804e0d6be"} Dec 12 00:44:19 crc kubenswrapper[4606]: I1212 00:44:19.184063 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.872406522 podStartE2EDuration="34.184044841s" podCreationTimestamp="2025-12-12 00:43:45 +0000 UTC" firstStartedPulling="2025-12-12 00:44:10.052080677 +0000 UTC m=+1240.597433543" lastFinishedPulling="2025-12-12 00:44:18.363718996 +0000 UTC m=+1248.909071862" observedRunningTime="2025-12-12 00:44:19.180940568 
+0000 UTC m=+1249.726293444" watchObservedRunningTime="2025-12-12 00:44:19.184044841 +0000 UTC m=+1249.729397707" Dec 12 00:44:19 crc kubenswrapper[4606]: I1212 00:44:19.214608 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=22.915097348 podStartE2EDuration="31.214581661s" podCreationTimestamp="2025-12-12 00:43:48 +0000 UTC" firstStartedPulling="2025-12-12 00:44:10.051849691 +0000 UTC m=+1240.597202557" lastFinishedPulling="2025-12-12 00:44:18.351334004 +0000 UTC m=+1248.896686870" observedRunningTime="2025-12-12 00:44:19.207152181 +0000 UTC m=+1249.752505067" watchObservedRunningTime="2025-12-12 00:44:19.214581661 +0000 UTC m=+1249.759934547" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.052931 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.095355 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.176784 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.219311 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.477105 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5p9bb"] Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.542197 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-g4dz7"] Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.543458 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.549245 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.549686 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kzfxx"] Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.550584 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.553153 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.557528 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kzfxx"] Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566632 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566674 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfxzc\" (UniqueName: \"kubernetes.io/projected/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-kube-api-access-lfxzc\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566698 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-combined-ca-bundle\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566715 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566741 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566758 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-config\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566784 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-config\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566816 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjww\" 
(UniqueName: \"kubernetes.io/projected/dce868a6-4924-41d6-bf03-38c519f2ab97-kube-api-access-wvjww\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566849 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-ovs-rundir\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566884 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-ovn-rundir\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.566959 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.567725 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.567757 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-g4dz7"] Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.651461 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.651638 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668378 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-ovs-rundir\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668449 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-ovn-rundir\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668504 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668527 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfxzc\" (UniqueName: \"kubernetes.io/projected/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-kube-api-access-lfxzc\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668555 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-combined-ca-bundle\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668573 4606 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668598 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668619 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-config\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668649 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-config\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668686 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjww\" (UniqueName: \"kubernetes.io/projected/dce868a6-4924-41d6-bf03-38c519f2ab97-kube-api-access-wvjww\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.668761 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-ovs-rundir\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.669020 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-ovn-rundir\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.671216 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-config\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.671922 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.672932 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-config\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.676996 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-combined-ca-bundle\") pod \"ovn-controller-metrics-kzfxx\" (UID: 
\"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.693283 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.695842 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.788311 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfxzc\" (UniqueName: \"kubernetes.io/projected/cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0-kube-api-access-lfxzc\") pod \"ovn-controller-metrics-kzfxx\" (UID: \"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0\") " pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.789381 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjww\" (UniqueName: \"kubernetes.io/projected/dce868a6-4924-41d6-bf03-38c519f2ab97-kube-api-access-wvjww\") pod \"dnsmasq-dns-5bf47b49b7-g4dz7\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.873265 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.882977 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kzfxx" Dec 12 00:44:20 crc kubenswrapper[4606]: I1212 00:44:20.938342 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.081491 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.170939 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dk28j"] Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.212226 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.217181 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndh26\" (UniqueName: \"kubernetes.io/projected/398360c3-4b1c-4dc0-be54-511d4ac621ee-kube-api-access-ndh26\") pod \"398360c3-4b1c-4dc0-be54-511d4ac621ee\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.217379 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-dns-svc\") pod \"398360c3-4b1c-4dc0-be54-511d4ac621ee\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.217451 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-config\") pod \"398360c3-4b1c-4dc0-be54-511d4ac621ee\" (UID: \"398360c3-4b1c-4dc0-be54-511d4ac621ee\") " Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.218504 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-config" (OuterVolumeSpecName: "config") pod "398360c3-4b1c-4dc0-be54-511d4ac621ee" (UID: "398360c3-4b1c-4dc0-be54-511d4ac621ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.226918 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398360c3-4b1c-4dc0-be54-511d4ac621ee-kube-api-access-ndh26" (OuterVolumeSpecName: "kube-api-access-ndh26") pod "398360c3-4b1c-4dc0-be54-511d4ac621ee" (UID: "398360c3-4b1c-4dc0-be54-511d4ac621ee"). InnerVolumeSpecName "kube-api-access-ndh26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.248611 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" event={"ID":"398360c3-4b1c-4dc0-be54-511d4ac621ee","Type":"ContainerDied","Data":"65174a4b41c5b566cb9ca8e2a71ff42e51dc1909a4cd0b5f48ef814f809010fd"} Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.249941 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "398360c3-4b1c-4dc0-be54-511d4ac621ee" (UID: "398360c3-4b1c-4dc0-be54-511d4ac621ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.266958 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-bscb8"] Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.289440 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.297319 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.321654 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bscb8"] Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.323306 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.323332 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndh26\" (UniqueName: \"kubernetes.io/projected/398360c3-4b1c-4dc0-be54-511d4ac621ee-kube-api-access-ndh26\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.323342 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/398360c3-4b1c-4dc0-be54-511d4ac621ee-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.414912 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.425530 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rf5c\" (UniqueName: \"kubernetes.io/projected/be7f7dee-5960-4b93-890f-d5898b9d6457-kube-api-access-4rf5c\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.425607 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-config\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.425642 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.425767 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.425874 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-dns-svc\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.440740 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.526876 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 
crc kubenswrapper[4606]: I1212 00:44:21.526942 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-dns-svc\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.526996 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rf5c\" (UniqueName: \"kubernetes.io/projected/be7f7dee-5960-4b93-890f-d5898b9d6457-kube-api-access-4rf5c\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.527026 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-config\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.527054 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.527775 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.527916 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.528527 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-config\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.529035 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-dns-svc\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.549046 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rf5c\" (UniqueName: \"kubernetes.io/projected/be7f7dee-5960-4b93-890f-d5898b9d6457-kube-api-access-4rf5c\") pod \"dnsmasq-dns-8554648995-bscb8\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.652707 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.663254 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-g4dz7"] Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.753341 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kzfxx"] Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.893891 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.895793 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.901218 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.904193 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.904349 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xkrjs" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.909715 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.910008 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.911376 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.934707 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-config\") pod 
\"059cf819-e18e-493b-a65b-9f3f8b5d683f\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.934857 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-dns-svc\") pod \"059cf819-e18e-493b-a65b-9f3f8b5d683f\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935024 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc7cv\" (UniqueName: \"kubernetes.io/projected/059cf819-e18e-493b-a65b-9f3f8b5d683f-kube-api-access-rc7cv\") pod \"059cf819-e18e-493b-a65b-9f3f8b5d683f\" (UID: \"059cf819-e18e-493b-a65b-9f3f8b5d683f\") " Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935299 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krbmf\" (UniqueName: \"kubernetes.io/projected/07c6dfb8-2190-4189-a3b7-f85da57160a1-kube-api-access-krbmf\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935381 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935421 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc 
kubenswrapper[4606]: I1212 00:44:21.935444 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935481 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07c6dfb8-2190-4189-a3b7-f85da57160a1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935515 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c6dfb8-2190-4189-a3b7-f85da57160a1-config\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935535 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07c6dfb8-2190-4189-a3b7-f85da57160a1-scripts\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.935841 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-config" (OuterVolumeSpecName: "config") pod "059cf819-e18e-493b-a65b-9f3f8b5d683f" (UID: "059cf819-e18e-493b-a65b-9f3f8b5d683f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.942606 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059cf819-e18e-493b-a65b-9f3f8b5d683f-kube-api-access-rc7cv" (OuterVolumeSpecName: "kube-api-access-rc7cv") pod "059cf819-e18e-493b-a65b-9f3f8b5d683f" (UID: "059cf819-e18e-493b-a65b-9f3f8b5d683f"). InnerVolumeSpecName "kube-api-access-rc7cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.946010 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "059cf819-e18e-493b-a65b-9f3f8b5d683f" (UID: "059cf819-e18e-493b-a65b-9f3f8b5d683f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:21 crc kubenswrapper[4606]: I1212 00:44:21.978492 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.036860 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07c6dfb8-2190-4189-a3b7-f85da57160a1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.036892 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c6dfb8-2190-4189-a3b7-f85da57160a1-config\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.036918 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/07c6dfb8-2190-4189-a3b7-f85da57160a1-scripts\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.036972 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krbmf\" (UniqueName: \"kubernetes.io/projected/07c6dfb8-2190-4189-a3b7-f85da57160a1-kube-api-access-krbmf\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.037055 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.037104 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.037123 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.037201 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc7cv\" (UniqueName: \"kubernetes.io/projected/059cf819-e18e-493b-a65b-9f3f8b5d683f-kube-api-access-rc7cv\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.037213 4606 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.037223 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/059cf819-e18e-493b-a65b-9f3f8b5d683f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.037977 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c6dfb8-2190-4189-a3b7-f85da57160a1-config\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.038415 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/07c6dfb8-2190-4189-a3b7-f85da57160a1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.039495 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07c6dfb8-2190-4189-a3b7-f85da57160a1-scripts\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.041071 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.044581 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.045692 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c6dfb8-2190-4189-a3b7-f85da57160a1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.061321 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krbmf\" (UniqueName: \"kubernetes.io/projected/07c6dfb8-2190-4189-a3b7-f85da57160a1-kube-api-access-krbmf\") pod \"ovn-northd-0\" (UID: \"07c6dfb8-2190-4189-a3b7-f85da57160a1\") " pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.093582 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.222286 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.254255 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" event={"ID":"059cf819-e18e-493b-a65b-9f3f8b5d683f","Type":"ContainerDied","Data":"a3ed91ad4c0871f703d8ca2e1e25d3cadf059b463aa54ac42174a5298320faad"} Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.254536 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dk28j" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.257440 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kzfxx" event={"ID":"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0","Type":"ContainerStarted","Data":"0e9e3136e2865c7cfd4f75c68f573bf24c7922e60bad094e65dd5f220cbb2144"} Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.257474 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kzfxx" event={"ID":"cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0","Type":"ContainerStarted","Data":"e129e761c0acd336757e8a94c29a2f98c45e93b6a281c87c22ebaaa5bacfea01"} Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.264952 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5p9bb" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.269959 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" event={"ID":"dce868a6-4924-41d6-bf03-38c519f2ab97","Type":"ContainerStarted","Data":"2617a7822d137f8e129507dc2667fbcefa73f1b819442f0fe98e18223d0e284f"} Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.313117 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bscb8"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.317631 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kzfxx" podStartSLOduration=2.317449692 podStartE2EDuration="2.317449692s" podCreationTimestamp="2025-12-12 00:44:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:44:22.286151192 +0000 UTC m=+1252.831504068" watchObservedRunningTime="2025-12-12 00:44:22.317449692 +0000 UTC m=+1252.862802558" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 
00:44:22.414630 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5p9bb"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.444815 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5p9bb"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.478187 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.504665 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dk28j"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.519673 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dk28j"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.831338 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.837768 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-g4dz7"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.894026 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v224c"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.899724 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.947629 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v224c"] Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.964459 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sczjk\" (UniqueName: \"kubernetes.io/projected/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-kube-api-access-sczjk\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.964509 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.964643 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.964673 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-config\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:22 crc kubenswrapper[4606]: I1212 00:44:22.964721 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.068862 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.068913 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-config\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.068940 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.068989 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sczjk\" (UniqueName: \"kubernetes.io/projected/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-kube-api-access-sczjk\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.069008 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.070015 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.070329 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-config\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.070384 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.072556 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-v224c\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.091730 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sczjk\" (UniqueName: \"kubernetes.io/projected/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-kube-api-access-sczjk\") pod \"dnsmasq-dns-b8fbc5445-v224c\" 
(UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.236098 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.284011 4606 generic.go:334] "Generic (PLEG): container finished" podID="dce868a6-4924-41d6-bf03-38c519f2ab97" containerID="d9834e04803c8db6539f6defbba4e8febbafcd8f184a7debbaba80cb62a3bd73" exitCode=0 Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.284097 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" event={"ID":"dce868a6-4924-41d6-bf03-38c519f2ab97","Type":"ContainerDied","Data":"d9834e04803c8db6539f6defbba4e8febbafcd8f184a7debbaba80cb62a3bd73"} Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.297758 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"07c6dfb8-2190-4189-a3b7-f85da57160a1","Type":"ContainerStarted","Data":"db4303be012507ca5d6997da1e2999e11883323c9e2316ef8eee5c09c6e6ad61"} Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.330809 4606 generic.go:334] "Generic (PLEG): container finished" podID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerID="74d246e6ae84a658eb0035be57608d48d855503201885c64f7329680fb7080a0" exitCode=0 Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.332709 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bscb8" event={"ID":"be7f7dee-5960-4b93-890f-d5898b9d6457","Type":"ContainerDied","Data":"74d246e6ae84a658eb0035be57608d48d855503201885c64f7329680fb7080a0"} Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.332775 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bscb8" 
event={"ID":"be7f7dee-5960-4b93-890f-d5898b9d6457","Type":"ContainerStarted","Data":"7680f06a03aba3a0069633a8c7ef72e6b573ef733282aeacd9451c82514da4bc"} Dec 12 00:44:23 crc kubenswrapper[4606]: E1212 00:44:23.591205 4606 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 12 00:44:23 crc kubenswrapper[4606]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/dce868a6-4924-41d6-bf03-38c519f2ab97/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 12 00:44:23 crc kubenswrapper[4606]: > podSandboxID="2617a7822d137f8e129507dc2667fbcefa73f1b819442f0fe98e18223d0e284f" Dec 12 00:44:23 crc kubenswrapper[4606]: E1212 00:44:23.591815 4606 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 12 00:44:23 crc kubenswrapper[4606]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh65dh95hf6h595hf6hf5h59dh6h57dh558h55ch5dbh5f5h565h5f7h9fh76h58ch54dh84h59bh7fh6bh5b9h59h67fh566h56h5f4h554h58fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvjww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5bf47b49b7-g4dz7_openstack(dce868a6-4924-41d6-bf03-38c519f2ab97): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/dce868a6-4924-41d6-bf03-38c519f2ab97/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 12 00:44:23 crc kubenswrapper[4606]: > logger="UnhandledError" Dec 12 00:44:23 crc kubenswrapper[4606]: E1212 00:44:23.593332 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/dce868a6-4924-41d6-bf03-38c519f2ab97/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" podUID="dce868a6-4924-41d6-bf03-38c519f2ab97" Dec 12 00:44:23 crc kubenswrapper[4606]: E1212 00:44:23.630792 4606 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 12 00:44:23 crc kubenswrapper[4606]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/be7f7dee-5960-4b93-890f-d5898b9d6457/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 12 00:44:23 crc kubenswrapper[4606]: > podSandboxID="7680f06a03aba3a0069633a8c7ef72e6b573ef733282aeacd9451c82514da4bc" Dec 12 00:44:23 crc kubenswrapper[4606]: E1212 00:44:23.631116 4606 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 12 00:44:23 crc kubenswrapper[4606]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,Rec
ursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rf5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-bscb8_openstack(be7f7dee-5960-4b93-890f-d5898b9d6457): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/be7f7dee-5960-4b93-890f-d5898b9d6457/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 12 00:44:23 crc kubenswrapper[4606]: > logger="UnhandledError" Dec 12 00:44:23 crc kubenswrapper[4606]: E1212 00:44:23.632274 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/be7f7dee-5960-4b93-890f-d5898b9d6457/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-bscb8" podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.713318 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059cf819-e18e-493b-a65b-9f3f8b5d683f" path="/var/lib/kubelet/pods/059cf819-e18e-493b-a65b-9f3f8b5d683f/volumes" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.713678 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398360c3-4b1c-4dc0-be54-511d4ac621ee" path="/var/lib/kubelet/pods/398360c3-4b1c-4dc0-be54-511d4ac621ee/volumes" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.752876 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v224c"] Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.973667 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.987367 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.991339 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.991597 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.991680 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 12 00:44:23 crc kubenswrapper[4606]: I1212 00:44:23.991822 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-nlnhx" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.030132 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.096463 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.096550 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtk74\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-kube-api-access-xtk74\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.096593 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" 
Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.096804 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6963a48d-4eff-4349-bc36-2356ec73c08c-cache\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.096836 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6963a48d-4eff-4349-bc36-2356ec73c08c-lock\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.198140 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6963a48d-4eff-4349-bc36-2356ec73c08c-cache\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.198208 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6963a48d-4eff-4349-bc36-2356ec73c08c-lock\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.198270 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.198306 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtk74\" (UniqueName: 
\"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-kube-api-access-xtk74\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.198338 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: E1212 00:44:24.198512 4606 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 12 00:44:24 crc kubenswrapper[4606]: E1212 00:44:24.198526 4606 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 12 00:44:24 crc kubenswrapper[4606]: E1212 00:44:24.198569 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift podName:6963a48d-4eff-4349-bc36-2356ec73c08c nodeName:}" failed. No retries permitted until 2025-12-12 00:44:24.698552836 +0000 UTC m=+1255.243905702 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift") pod "swift-storage-0" (UID: "6963a48d-4eff-4349-bc36-2356ec73c08c") : configmap "swift-ring-files" not found Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.199103 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6963a48d-4eff-4349-bc36-2356ec73c08c-cache\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.199333 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6963a48d-4eff-4349-bc36-2356ec73c08c-lock\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.199571 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.219934 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.232360 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtk74\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-kube-api-access-xtk74\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " 
pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.338592 4606 generic.go:334] "Generic (PLEG): container finished" podID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerID="b389a3beb415877ef288eec5fb32660337c20df2a722a5732809ce332fcb288d" exitCode=0 Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.338742 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" event={"ID":"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d","Type":"ContainerDied","Data":"b389a3beb415877ef288eec5fb32660337c20df2a722a5732809ce332fcb288d"} Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.339567 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" event={"ID":"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d","Type":"ContainerStarted","Data":"cbab86f029bb8028e8c380aaee0515f5ed8f624f81a5f5f244e2b95c6ce8c203"} Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.672759 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.710035 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:24 crc kubenswrapper[4606]: E1212 00:44:24.710399 4606 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 12 00:44:24 crc kubenswrapper[4606]: E1212 00:44:24.710432 4606 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 12 00:44:24 crc kubenswrapper[4606]: E1212 00:44:24.710470 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift podName:6963a48d-4eff-4349-bc36-2356ec73c08c nodeName:}" failed. No retries permitted until 2025-12-12 00:44:25.710457474 +0000 UTC m=+1256.255810340 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift") pod "swift-storage-0" (UID: "6963a48d-4eff-4349-bc36-2356ec73c08c") : configmap "swift-ring-files" not found Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.812469 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-dns-svc\") pod \"dce868a6-4924-41d6-bf03-38c519f2ab97\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.812507 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjww\" (UniqueName: \"kubernetes.io/projected/dce868a6-4924-41d6-bf03-38c519f2ab97-kube-api-access-wvjww\") pod \"dce868a6-4924-41d6-bf03-38c519f2ab97\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.812678 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-ovsdbserver-nb\") pod \"dce868a6-4924-41d6-bf03-38c519f2ab97\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.812719 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-config\") pod \"dce868a6-4924-41d6-bf03-38c519f2ab97\" (UID: \"dce868a6-4924-41d6-bf03-38c519f2ab97\") " Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.821604 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dce868a6-4924-41d6-bf03-38c519f2ab97-kube-api-access-wvjww" (OuterVolumeSpecName: "kube-api-access-wvjww") pod "dce868a6-4924-41d6-bf03-38c519f2ab97" (UID: 
"dce868a6-4924-41d6-bf03-38c519f2ab97"). InnerVolumeSpecName "kube-api-access-wvjww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.884307 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-config" (OuterVolumeSpecName: "config") pod "dce868a6-4924-41d6-bf03-38c519f2ab97" (UID: "dce868a6-4924-41d6-bf03-38c519f2ab97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.896578 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dce868a6-4924-41d6-bf03-38c519f2ab97" (UID: "dce868a6-4924-41d6-bf03-38c519f2ab97"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.900984 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dce868a6-4924-41d6-bf03-38c519f2ab97" (UID: "dce868a6-4924-41d6-bf03-38c519f2ab97"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.914793 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.914932 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.915458 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dce868a6-4924-41d6-bf03-38c519f2ab97-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:24 crc kubenswrapper[4606]: I1212 00:44:24.915583 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjww\" (UniqueName: \"kubernetes.io/projected/dce868a6-4924-41d6-bf03-38c519f2ab97-kube-api-access-wvjww\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.348223 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" event={"ID":"dce868a6-4924-41d6-bf03-38c519f2ab97","Type":"ContainerDied","Data":"2617a7822d137f8e129507dc2667fbcefa73f1b819442f0fe98e18223d0e284f"} Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.348256 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-g4dz7" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.348292 4606 scope.go:117] "RemoveContainer" containerID="d9834e04803c8db6539f6defbba4e8febbafcd8f184a7debbaba80cb62a3bd73" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.354451 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d","Type":"ContainerStarted","Data":"19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c"} Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.354658 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.364940 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"07c6dfb8-2190-4189-a3b7-f85da57160a1","Type":"ContainerStarted","Data":"0d36334f6504874007cd98b803bfd6bef56e0819f87ce6e0fd82527ddeed307d"} Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.364979 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"07c6dfb8-2190-4189-a3b7-f85da57160a1","Type":"ContainerStarted","Data":"78e5fcfcf1a744d26871ae368c406d1217c5e9b86b2dd9c4ad29c17f2e4b505d"} Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.365679 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.367831 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" event={"ID":"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d","Type":"ContainerStarted","Data":"bb4b0e071f6d1520c484ab216a2d53fd85e9cf5e48c7849c5f250a898e52f497"} Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.368267 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:25 crc 
kubenswrapper[4606]: I1212 00:44:25.388424 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bscb8" event={"ID":"be7f7dee-5960-4b93-890f-d5898b9d6457","Type":"ContainerStarted","Data":"3d69d59647502af67a6e346718d81263dc3f1321fd97427771810090805c7351"} Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.390379 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.399584 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.68563518 podStartE2EDuration="43.399569637s" podCreationTimestamp="2025-12-12 00:43:42 +0000 UTC" firstStartedPulling="2025-12-12 00:43:43.43447953 +0000 UTC m=+1213.979832396" lastFinishedPulling="2025-12-12 00:44:25.148413987 +0000 UTC m=+1255.693766853" observedRunningTime="2025-12-12 00:44:25.37623182 +0000 UTC m=+1255.921584686" watchObservedRunningTime="2025-12-12 00:44:25.399569637 +0000 UTC m=+1255.944922503" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.405461 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.586324954 podStartE2EDuration="4.405440144s" podCreationTimestamp="2025-12-12 00:44:21 +0000 UTC" firstStartedPulling="2025-12-12 00:44:22.833661046 +0000 UTC m=+1253.379013912" lastFinishedPulling="2025-12-12 00:44:24.652776236 +0000 UTC m=+1255.198129102" observedRunningTime="2025-12-12 00:44:25.397974014 +0000 UTC m=+1255.943326900" watchObservedRunningTime="2025-12-12 00:44:25.405440144 +0000 UTC m=+1255.950793010" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.423382 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" podStartSLOduration=3.423366855 podStartE2EDuration="3.423366855s" podCreationTimestamp="2025-12-12 00:44:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:44:25.418073493 +0000 UTC m=+1255.963426359" watchObservedRunningTime="2025-12-12 00:44:25.423366855 +0000 UTC m=+1255.968719721" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.463876 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-g4dz7"] Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.475933 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-g4dz7"] Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.479030 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-bscb8" podStartSLOduration=4.479018749 podStartE2EDuration="4.479018749s" podCreationTimestamp="2025-12-12 00:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:44:25.468762794 +0000 UTC m=+1256.014115660" watchObservedRunningTime="2025-12-12 00:44:25.479018749 +0000 UTC m=+1256.024371615" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.712470 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dce868a6-4924-41d6-bf03-38c519f2ab97" path="/var/lib/kubelet/pods/dce868a6-4924-41d6-bf03-38c519f2ab97/volumes" Dec 12 00:44:25 crc kubenswrapper[4606]: I1212 00:44:25.729121 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:25 crc kubenswrapper[4606]: E1212 00:44:25.729302 4606 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 12 00:44:25 crc kubenswrapper[4606]: E1212 
00:44:25.729326 4606 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 12 00:44:25 crc kubenswrapper[4606]: E1212 00:44:25.729387 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift podName:6963a48d-4eff-4349-bc36-2356ec73c08c nodeName:}" failed. No retries permitted until 2025-12-12 00:44:27.729369148 +0000 UTC m=+1258.274722014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift") pod "swift-storage-0" (UID: "6963a48d-4eff-4349-bc36-2356ec73c08c") : configmap "swift-ring-files" not found Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.762337 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:27 crc kubenswrapper[4606]: E1212 00:44:27.762753 4606 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 12 00:44:27 crc kubenswrapper[4606]: E1212 00:44:27.764273 4606 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 12 00:44:27 crc kubenswrapper[4606]: E1212 00:44:27.764367 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift podName:6963a48d-4eff-4349-bc36-2356ec73c08c nodeName:}" failed. No retries permitted until 2025-12-12 00:44:31.764341021 +0000 UTC m=+1262.309693927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift") pod "swift-storage-0" (UID: "6963a48d-4eff-4349-bc36-2356ec73c08c") : configmap "swift-ring-files" not found Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.918116 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-t97v8"] Dec 12 00:44:27 crc kubenswrapper[4606]: E1212 00:44:27.918542 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dce868a6-4924-41d6-bf03-38c519f2ab97" containerName="init" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.918560 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce868a6-4924-41d6-bf03-38c519f2ab97" containerName="init" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.918751 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="dce868a6-4924-41d6-bf03-38c519f2ab97" containerName="init" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.919396 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.926065 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.926144 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.926218 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.935406 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-t97v8"] Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.967322 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-dispersionconf\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.967374 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-swiftconf\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.967409 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4a54eac-00ee-452a-9c4b-e777e338e670-etc-swift\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.967430 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-ring-data-devices\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.967461 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-combined-ca-bundle\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.967480 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-scripts\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:27 crc kubenswrapper[4606]: I1212 00:44:27.967506 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn7n2\" (UniqueName: \"kubernetes.io/projected/d4a54eac-00ee-452a-9c4b-e777e338e670-kube-api-access-dn7n2\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069385 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4a54eac-00ee-452a-9c4b-e777e338e670-etc-swift\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069454 
4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-ring-data-devices\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069508 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-combined-ca-bundle\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069535 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-scripts\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069573 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn7n2\" (UniqueName: \"kubernetes.io/projected/d4a54eac-00ee-452a-9c4b-e777e338e670-kube-api-access-dn7n2\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069663 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-dispersionconf\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069701 4606 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-swiftconf\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.069928 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4a54eac-00ee-452a-9c4b-e777e338e670-etc-swift\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.070374 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-ring-data-devices\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.070471 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-scripts\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.074905 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-swiftconf\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.076270 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-dispersionconf\") pod 
\"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.077414 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-combined-ca-bundle\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.089030 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn7n2\" (UniqueName: \"kubernetes.io/projected/d4a54eac-00ee-452a-9c4b-e777e338e670-kube-api-access-dn7n2\") pod \"swift-ring-rebalance-t97v8\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.238277 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:28 crc kubenswrapper[4606]: I1212 00:44:28.511863 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-t97v8"] Dec 12 00:44:29 crc kubenswrapper[4606]: I1212 00:44:29.430289 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t97v8" event={"ID":"d4a54eac-00ee-452a-9c4b-e777e338e670","Type":"ContainerStarted","Data":"db2ffc32613601924f1da4713ab038b95d293a0db158ae4f9d163c10075200e8"} Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.691756 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-z942v"] Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.693250 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z942v" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.715702 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ee90-account-create-update-2wb28"] Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.717126 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.720852 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.729064 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbt62\" (UniqueName: \"kubernetes.io/projected/38c5f767-2214-4103-94ef-c9b98cfb9269-kube-api-access-rbt62\") pod \"keystone-db-create-z942v\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " pod="openstack/keystone-db-create-z942v" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.729100 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38c5f767-2214-4103-94ef-c9b98cfb9269-operator-scripts\") pod \"keystone-db-create-z942v\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " pod="openstack/keystone-db-create-z942v" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.729136 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c857da8-746f-4a51-b509-e6ed45614ab6-operator-scripts\") pod \"keystone-ee90-account-create-update-2wb28\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.729239 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vmpzd\" (UniqueName: \"kubernetes.io/projected/6c857da8-746f-4a51-b509-e6ed45614ab6-kube-api-access-vmpzd\") pod \"keystone-ee90-account-create-update-2wb28\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.730187 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z942v"] Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.737053 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ee90-account-create-update-2wb28"] Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.830752 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbt62\" (UniqueName: \"kubernetes.io/projected/38c5f767-2214-4103-94ef-c9b98cfb9269-kube-api-access-rbt62\") pod \"keystone-db-create-z942v\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " pod="openstack/keystone-db-create-z942v" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.830803 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38c5f767-2214-4103-94ef-c9b98cfb9269-operator-scripts\") pod \"keystone-db-create-z942v\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " pod="openstack/keystone-db-create-z942v" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.830839 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c857da8-746f-4a51-b509-e6ed45614ab6-operator-scripts\") pod \"keystone-ee90-account-create-update-2wb28\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.830900 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vmpzd\" (UniqueName: \"kubernetes.io/projected/6c857da8-746f-4a51-b509-e6ed45614ab6-kube-api-access-vmpzd\") pod \"keystone-ee90-account-create-update-2wb28\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.832148 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c857da8-746f-4a51-b509-e6ed45614ab6-operator-scripts\") pod \"keystone-ee90-account-create-update-2wb28\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.833657 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38c5f767-2214-4103-94ef-c9b98cfb9269-operator-scripts\") pod \"keystone-db-create-z942v\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " pod="openstack/keystone-db-create-z942v" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.850670 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmpzd\" (UniqueName: \"kubernetes.io/projected/6c857da8-746f-4a51-b509-e6ed45614ab6-kube-api-access-vmpzd\") pod \"keystone-ee90-account-create-update-2wb28\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.863665 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbt62\" (UniqueName: \"kubernetes.io/projected/38c5f767-2214-4103-94ef-c9b98cfb9269-kube-api-access-rbt62\") pod \"keystone-db-create-z942v\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " pod="openstack/keystone-db-create-z942v" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.931465 4606 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-create-rpf5j"] Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.932424 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:30 crc kubenswrapper[4606]: I1212 00:44:30.968216 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rpf5j"] Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.014501 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z942v" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.035379 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4r7w\" (UniqueName: \"kubernetes.io/projected/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-kube-api-access-k4r7w\") pod \"placement-db-create-rpf5j\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.035500 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-operator-scripts\") pod \"placement-db-create-rpf5j\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.045664 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.069042 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6e57-account-create-update-bstwq"] Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.070012 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.075524 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.090289 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6e57-account-create-update-bstwq"] Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.136878 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhkj9\" (UniqueName: \"kubernetes.io/projected/d34d547f-c6ae-48f9-8df6-1d2d35942f23-kube-api-access-lhkj9\") pod \"placement-6e57-account-create-update-bstwq\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.136931 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d34d547f-c6ae-48f9-8df6-1d2d35942f23-operator-scripts\") pod \"placement-6e57-account-create-update-bstwq\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.137114 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-operator-scripts\") pod \"placement-db-create-rpf5j\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.137322 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4r7w\" (UniqueName: \"kubernetes.io/projected/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-kube-api-access-k4r7w\") pod 
\"placement-db-create-rpf5j\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.138675 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-operator-scripts\") pod \"placement-db-create-rpf5j\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.167412 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4r7w\" (UniqueName: \"kubernetes.io/projected/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-kube-api-access-k4r7w\") pod \"placement-db-create-rpf5j\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.238857 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhkj9\" (UniqueName: \"kubernetes.io/projected/d34d547f-c6ae-48f9-8df6-1d2d35942f23-kube-api-access-lhkj9\") pod \"placement-6e57-account-create-update-bstwq\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.238925 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d34d547f-c6ae-48f9-8df6-1d2d35942f23-operator-scripts\") pod \"placement-6e57-account-create-update-bstwq\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.239872 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d34d547f-c6ae-48f9-8df6-1d2d35942f23-operator-scripts\") pod 
\"placement-6e57-account-create-update-bstwq\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.256119 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhkj9\" (UniqueName: \"kubernetes.io/projected/d34d547f-c6ae-48f9-8df6-1d2d35942f23-kube-api-access-lhkj9\") pod \"placement-6e57-account-create-update-bstwq\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.270117 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.379642 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nm7sl"] Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.381158 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.387979 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nm7sl"] Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.389753 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.445993 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2358288-ebbf-4430-9684-7bcdf01349a4-operator-scripts\") pod \"glance-db-create-nm7sl\" (UID: \"b2358288-ebbf-4430-9684-7bcdf01349a4\") " pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.446394 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdqr\" (UniqueName: \"kubernetes.io/projected/b2358288-ebbf-4430-9684-7bcdf01349a4-kube-api-access-2cdqr\") pod \"glance-db-create-nm7sl\" (UID: \"b2358288-ebbf-4430-9684-7bcdf01349a4\") " pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.497021 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4df9-account-create-update-7t79l"] Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.499377 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.506100 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.510058 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4df9-account-create-update-7t79l"] Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.548385 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rznv\" (UniqueName: \"kubernetes.io/projected/92703eda-3f9e-40c1-9eef-c637ccfe0552-kube-api-access-2rznv\") pod \"glance-4df9-account-create-update-7t79l\" (UID: \"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.548786 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92703eda-3f9e-40c1-9eef-c637ccfe0552-operator-scripts\") pod \"glance-4df9-account-create-update-7t79l\" (UID: \"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.548823 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2358288-ebbf-4430-9684-7bcdf01349a4-operator-scripts\") pod \"glance-db-create-nm7sl\" (UID: \"b2358288-ebbf-4430-9684-7bcdf01349a4\") " pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.548863 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdqr\" (UniqueName: \"kubernetes.io/projected/b2358288-ebbf-4430-9684-7bcdf01349a4-kube-api-access-2cdqr\") pod \"glance-db-create-nm7sl\" (UID: 
\"b2358288-ebbf-4430-9684-7bcdf01349a4\") " pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.549620 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2358288-ebbf-4430-9684-7bcdf01349a4-operator-scripts\") pod \"glance-db-create-nm7sl\" (UID: \"b2358288-ebbf-4430-9684-7bcdf01349a4\") " pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.564558 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdqr\" (UniqueName: \"kubernetes.io/projected/b2358288-ebbf-4430-9684-7bcdf01349a4-kube-api-access-2cdqr\") pod \"glance-db-create-nm7sl\" (UID: \"b2358288-ebbf-4430-9684-7bcdf01349a4\") " pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.650655 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rznv\" (UniqueName: \"kubernetes.io/projected/92703eda-3f9e-40c1-9eef-c637ccfe0552-kube-api-access-2rznv\") pod \"glance-4df9-account-create-update-7t79l\" (UID: \"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.650952 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92703eda-3f9e-40c1-9eef-c637ccfe0552-operator-scripts\") pod \"glance-4df9-account-create-update-7t79l\" (UID: \"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.651636 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92703eda-3f9e-40c1-9eef-c637ccfe0552-operator-scripts\") pod \"glance-4df9-account-create-update-7t79l\" (UID: 
\"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.654087 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.691835 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rznv\" (UniqueName: \"kubernetes.io/projected/92703eda-3f9e-40c1-9eef-c637ccfe0552-kube-api-access-2rznv\") pod \"glance-4df9-account-create-update-7t79l\" (UID: \"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.777525 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.831787 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:31 crc kubenswrapper[4606]: I1212 00:44:31.855341 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:31 crc kubenswrapper[4606]: E1212 00:44:31.855775 4606 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 12 00:44:31 crc kubenswrapper[4606]: E1212 00:44:31.855893 4606 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 12 00:44:31 crc kubenswrapper[4606]: E1212 00:44:31.856021 4606 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift podName:6963a48d-4eff-4349-bc36-2356ec73c08c nodeName:}" failed. No retries permitted until 2025-12-12 00:44:39.855999228 +0000 UTC m=+1270.401352094 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift") pod "swift-storage-0" (UID: "6963a48d-4eff-4349-bc36-2356ec73c08c") : configmap "swift-ring-files" not found Dec 12 00:44:32 crc kubenswrapper[4606]: I1212 00:44:32.010049 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:44:32 crc kubenswrapper[4606]: I1212 00:44:32.010093 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:44:32 crc kubenswrapper[4606]: I1212 00:44:32.851569 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 12 00:44:33 crc kubenswrapper[4606]: I1212 00:44:33.238183 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:44:33 crc kubenswrapper[4606]: I1212 00:44:33.306390 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bscb8"] Dec 12 00:44:33 crc kubenswrapper[4606]: I1212 00:44:33.330725 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-bscb8" 
podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerName="dnsmasq-dns" containerID="cri-o://3d69d59647502af67a6e346718d81263dc3f1321fd97427771810090805c7351" gracePeriod=10 Dec 12 00:44:34 crc kubenswrapper[4606]: I1212 00:44:34.490308 4606 generic.go:334] "Generic (PLEG): container finished" podID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerID="3d69d59647502af67a6e346718d81263dc3f1321fd97427771810090805c7351" exitCode=0 Dec 12 00:44:34 crc kubenswrapper[4606]: I1212 00:44:34.490348 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bscb8" event={"ID":"be7f7dee-5960-4b93-890f-d5898b9d6457","Type":"ContainerDied","Data":"3d69d59647502af67a6e346718d81263dc3f1321fd97427771810090805c7351"} Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.208857 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.325087 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-nb\") pod \"be7f7dee-5960-4b93-890f-d5898b9d6457\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.325190 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-config\") pod \"be7f7dee-5960-4b93-890f-d5898b9d6457\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.325229 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rf5c\" (UniqueName: \"kubernetes.io/projected/be7f7dee-5960-4b93-890f-d5898b9d6457-kube-api-access-4rf5c\") pod \"be7f7dee-5960-4b93-890f-d5898b9d6457\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " Dec 
12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.325295 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-sb\") pod \"be7f7dee-5960-4b93-890f-d5898b9d6457\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.326227 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-dns-svc\") pod \"be7f7dee-5960-4b93-890f-d5898b9d6457\" (UID: \"be7f7dee-5960-4b93-890f-d5898b9d6457\") " Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.338771 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z942v"] Dec 12 00:44:35 crc kubenswrapper[4606]: W1212 00:44:35.346777 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38c5f767_2214_4103_94ef_c9b98cfb9269.slice/crio-ca1aa3cbda517801dead0c003e676ddcc787129e7edc82dbe0ff9ef13b3b980b WatchSource:0}: Error finding container ca1aa3cbda517801dead0c003e676ddcc787129e7edc82dbe0ff9ef13b3b980b: Status 404 returned error can't find the container with id ca1aa3cbda517801dead0c003e676ddcc787129e7edc82dbe0ff9ef13b3b980b Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.351345 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be7f7dee-5960-4b93-890f-d5898b9d6457-kube-api-access-4rf5c" (OuterVolumeSpecName: "kube-api-access-4rf5c") pod "be7f7dee-5960-4b93-890f-d5898b9d6457" (UID: "be7f7dee-5960-4b93-890f-d5898b9d6457"). InnerVolumeSpecName "kube-api-access-4rf5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.379500 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-config" (OuterVolumeSpecName: "config") pod "be7f7dee-5960-4b93-890f-d5898b9d6457" (UID: "be7f7dee-5960-4b93-890f-d5898b9d6457"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.386041 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "be7f7dee-5960-4b93-890f-d5898b9d6457" (UID: "be7f7dee-5960-4b93-890f-d5898b9d6457"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.390820 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be7f7dee-5960-4b93-890f-d5898b9d6457" (UID: "be7f7dee-5960-4b93-890f-d5898b9d6457"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.399750 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be7f7dee-5960-4b93-890f-d5898b9d6457" (UID: "be7f7dee-5960-4b93-890f-d5898b9d6457"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.435219 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.435246 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.435273 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.435283 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rf5c\" (UniqueName: \"kubernetes.io/projected/be7f7dee-5960-4b93-890f-d5898b9d6457-kube-api-access-4rf5c\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.435292 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be7f7dee-5960-4b93-890f-d5898b9d6457-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.500781 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t97v8" event={"ID":"d4a54eac-00ee-452a-9c4b-e777e338e670","Type":"ContainerStarted","Data":"b87adbb36f427f1e31235fbaca48e48cf1cbd3844df9ace46372196c22ec9e6a"} Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.508762 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z942v" 
event={"ID":"38c5f767-2214-4103-94ef-c9b98cfb9269","Type":"ContainerStarted","Data":"bfd693e38c2149d439bc9ac37a35c49a4535ee289f44be5a6267d84afb08398c"} Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.508793 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z942v" event={"ID":"38c5f767-2214-4103-94ef-c9b98cfb9269","Type":"ContainerStarted","Data":"ca1aa3cbda517801dead0c003e676ddcc787129e7edc82dbe0ff9ef13b3b980b"} Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.514070 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bscb8" event={"ID":"be7f7dee-5960-4b93-890f-d5898b9d6457","Type":"ContainerDied","Data":"7680f06a03aba3a0069633a8c7ef72e6b573ef733282aeacd9451c82514da4bc"} Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.514102 4606 scope.go:117] "RemoveContainer" containerID="3d69d59647502af67a6e346718d81263dc3f1321fd97427771810090805c7351" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.514224 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bscb8" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.524978 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-t97v8" podStartSLOduration=1.9967523950000001 podStartE2EDuration="8.524963732s" podCreationTimestamp="2025-12-12 00:44:27 +0000 UTC" firstStartedPulling="2025-12-12 00:44:28.521951073 +0000 UTC m=+1259.067303939" lastFinishedPulling="2025-12-12 00:44:35.05016241 +0000 UTC m=+1265.595515276" observedRunningTime="2025-12-12 00:44:35.524731976 +0000 UTC m=+1266.070084852" watchObservedRunningTime="2025-12-12 00:44:35.524963732 +0000 UTC m=+1266.070316598" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.549204 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ee90-account-create-update-2wb28"] Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.557643 4606 scope.go:117] "RemoveContainer" containerID="74d246e6ae84a658eb0035be57608d48d855503201885c64f7329680fb7080a0" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.557789 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rpf5j"] Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.568207 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-z942v" podStartSLOduration=5.568192192 podStartE2EDuration="5.568192192s" podCreationTimestamp="2025-12-12 00:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:44:35.546239103 +0000 UTC m=+1266.091591969" watchObservedRunningTime="2025-12-12 00:44:35.568192192 +0000 UTC m=+1266.113545058" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.586191 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6e57-account-create-update-bstwq"] Dec 12 00:44:35 crc 
kubenswrapper[4606]: I1212 00:44:35.597715 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4df9-account-create-update-7t79l"] Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.602823 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bscb8"] Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.619373 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bscb8"] Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.722094 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" path="/var/lib/kubelet/pods/be7f7dee-5960-4b93-890f-d5898b9d6457/volumes" Dec 12 00:44:35 crc kubenswrapper[4606]: I1212 00:44:35.722650 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nm7sl"] Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.527592 4606 generic.go:334] "Generic (PLEG): container finished" podID="38c5f767-2214-4103-94ef-c9b98cfb9269" containerID="bfd693e38c2149d439bc9ac37a35c49a4535ee289f44be5a6267d84afb08398c" exitCode=0 Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.527933 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z942v" event={"ID":"38c5f767-2214-4103-94ef-c9b98cfb9269","Type":"ContainerDied","Data":"bfd693e38c2149d439bc9ac37a35c49a4535ee289f44be5a6267d84afb08398c"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.530981 4606 generic.go:334] "Generic (PLEG): container finished" podID="92703eda-3f9e-40c1-9eef-c637ccfe0552" containerID="f25dad178361a4e06a05c81b514a8a431465f4ea2fe6bba51af3760c62b8d500" exitCode=0 Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.531141 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4df9-account-create-update-7t79l" 
event={"ID":"92703eda-3f9e-40c1-9eef-c637ccfe0552","Type":"ContainerDied","Data":"f25dad178361a4e06a05c81b514a8a431465f4ea2fe6bba51af3760c62b8d500"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.531363 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4df9-account-create-update-7t79l" event={"ID":"92703eda-3f9e-40c1-9eef-c637ccfe0552","Type":"ContainerStarted","Data":"4342006cf3e8856aa8265219e541b6dd49c8a48a89e37420c864d33862b49e76"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.558342 4606 generic.go:334] "Generic (PLEG): container finished" podID="b2358288-ebbf-4430-9684-7bcdf01349a4" containerID="dd2e2914261d84eecc80e972bb81081ea3349b7dddbb356f4faf14ffc5996a2a" exitCode=0 Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.558431 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nm7sl" event={"ID":"b2358288-ebbf-4430-9684-7bcdf01349a4","Type":"ContainerDied","Data":"dd2e2914261d84eecc80e972bb81081ea3349b7dddbb356f4faf14ffc5996a2a"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.558463 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nm7sl" event={"ID":"b2358288-ebbf-4430-9684-7bcdf01349a4","Type":"ContainerStarted","Data":"52428afdea2abfa879cc2e1634480a678dbda793b19ecb60a88de3b14473fa68"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.564731 4606 generic.go:334] "Generic (PLEG): container finished" podID="d34d547f-c6ae-48f9-8df6-1d2d35942f23" containerID="a3b6a1da3cc761d686854df968b045d2f1625ef9d6816a1aae23f451a66e1453" exitCode=0 Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.564815 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e57-account-create-update-bstwq" event={"ID":"d34d547f-c6ae-48f9-8df6-1d2d35942f23","Type":"ContainerDied","Data":"a3b6a1da3cc761d686854df968b045d2f1625ef9d6816a1aae23f451a66e1453"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.564846 4606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e57-account-create-update-bstwq" event={"ID":"d34d547f-c6ae-48f9-8df6-1d2d35942f23","Type":"ContainerStarted","Data":"7e19dec114c640e929dd23e6d6394d86d21a378c99b440645911b5fbb8d74a0a"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.574672 4606 generic.go:334] "Generic (PLEG): container finished" podID="6c857da8-746f-4a51-b509-e6ed45614ab6" containerID="f453eb4163c594ea7b3a74aba61b9ff9e7cc35070add842376817c36698caa93" exitCode=0 Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.574751 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee90-account-create-update-2wb28" event={"ID":"6c857da8-746f-4a51-b509-e6ed45614ab6","Type":"ContainerDied","Data":"f453eb4163c594ea7b3a74aba61b9ff9e7cc35070add842376817c36698caa93"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.574785 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee90-account-create-update-2wb28" event={"ID":"6c857da8-746f-4a51-b509-e6ed45614ab6","Type":"ContainerStarted","Data":"809617d458fc9417e7c8cb68686c6a0cf51dd96295d7543ef842d6e16e6ec010"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.580363 4606 generic.go:334] "Generic (PLEG): container finished" podID="f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5" containerID="e3707bda609b96a4e83c0b905bc76c57410e4ae657ff1a5cf39a5ac924f18912" exitCode=0 Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.581481 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rpf5j" event={"ID":"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5","Type":"ContainerDied","Data":"e3707bda609b96a4e83c0b905bc76c57410e4ae657ff1a5cf39a5ac924f18912"} Dec 12 00:44:36 crc kubenswrapper[4606]: I1212 00:44:36.581544 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rpf5j" 
event={"ID":"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5","Type":"ContainerStarted","Data":"e07202f06e075e65c26d9871dd004c200fb7179795cf8de55bfec9d4a5113908"} Dec 12 00:44:37 crc kubenswrapper[4606]: I1212 00:44:37.336412 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.022851 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.080563 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdqr\" (UniqueName: \"kubernetes.io/projected/b2358288-ebbf-4430-9684-7bcdf01349a4-kube-api-access-2cdqr\") pod \"b2358288-ebbf-4430-9684-7bcdf01349a4\" (UID: \"b2358288-ebbf-4430-9684-7bcdf01349a4\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.080679 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2358288-ebbf-4430-9684-7bcdf01349a4-operator-scripts\") pod \"b2358288-ebbf-4430-9684-7bcdf01349a4\" (UID: \"b2358288-ebbf-4430-9684-7bcdf01349a4\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.081893 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2358288-ebbf-4430-9684-7bcdf01349a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2358288-ebbf-4430-9684-7bcdf01349a4" (UID: "b2358288-ebbf-4430-9684-7bcdf01349a4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.090087 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2358288-ebbf-4430-9684-7bcdf01349a4-kube-api-access-2cdqr" (OuterVolumeSpecName: "kube-api-access-2cdqr") pod "b2358288-ebbf-4430-9684-7bcdf01349a4" (UID: "b2358288-ebbf-4430-9684-7bcdf01349a4"). InnerVolumeSpecName "kube-api-access-2cdqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.184152 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdqr\" (UniqueName: \"kubernetes.io/projected/b2358288-ebbf-4430-9684-7bcdf01349a4-kube-api-access-2cdqr\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.184208 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2358288-ebbf-4430-9684-7bcdf01349a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.349543 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.357496 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.364077 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z942v" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.373878 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.387096 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.388782 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbt62\" (UniqueName: \"kubernetes.io/projected/38c5f767-2214-4103-94ef-c9b98cfb9269-kube-api-access-rbt62\") pod \"38c5f767-2214-4103-94ef-c9b98cfb9269\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.388870 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d34d547f-c6ae-48f9-8df6-1d2d35942f23-operator-scripts\") pod \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.388896 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38c5f767-2214-4103-94ef-c9b98cfb9269-operator-scripts\") pod \"38c5f767-2214-4103-94ef-c9b98cfb9269\" (UID: \"38c5f767-2214-4103-94ef-c9b98cfb9269\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.388951 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c857da8-746f-4a51-b509-e6ed45614ab6-operator-scripts\") pod \"6c857da8-746f-4a51-b509-e6ed45614ab6\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.389008 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmpzd\" (UniqueName: \"kubernetes.io/projected/6c857da8-746f-4a51-b509-e6ed45614ab6-kube-api-access-vmpzd\") pod \"6c857da8-746f-4a51-b509-e6ed45614ab6\" (UID: \"6c857da8-746f-4a51-b509-e6ed45614ab6\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.389065 4606 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lhkj9\" (UniqueName: \"kubernetes.io/projected/d34d547f-c6ae-48f9-8df6-1d2d35942f23-kube-api-access-lhkj9\") pod \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\" (UID: \"d34d547f-c6ae-48f9-8df6-1d2d35942f23\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.389674 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38c5f767-2214-4103-94ef-c9b98cfb9269-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38c5f767-2214-4103-94ef-c9b98cfb9269" (UID: "38c5f767-2214-4103-94ef-c9b98cfb9269"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.389757 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34d547f-c6ae-48f9-8df6-1d2d35942f23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d34d547f-c6ae-48f9-8df6-1d2d35942f23" (UID: "d34d547f-c6ae-48f9-8df6-1d2d35942f23"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.390330 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c857da8-746f-4a51-b509-e6ed45614ab6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c857da8-746f-4a51-b509-e6ed45614ab6" (UID: "6c857da8-746f-4a51-b509-e6ed45614ab6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.392990 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34d547f-c6ae-48f9-8df6-1d2d35942f23-kube-api-access-lhkj9" (OuterVolumeSpecName: "kube-api-access-lhkj9") pod "d34d547f-c6ae-48f9-8df6-1d2d35942f23" (UID: "d34d547f-c6ae-48f9-8df6-1d2d35942f23"). 
InnerVolumeSpecName "kube-api-access-lhkj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.394685 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c857da8-746f-4a51-b509-e6ed45614ab6-kube-api-access-vmpzd" (OuterVolumeSpecName: "kube-api-access-vmpzd") pod "6c857da8-746f-4a51-b509-e6ed45614ab6" (UID: "6c857da8-746f-4a51-b509-e6ed45614ab6"). InnerVolumeSpecName "kube-api-access-vmpzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.407757 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c5f767-2214-4103-94ef-c9b98cfb9269-kube-api-access-rbt62" (OuterVolumeSpecName: "kube-api-access-rbt62") pod "38c5f767-2214-4103-94ef-c9b98cfb9269" (UID: "38c5f767-2214-4103-94ef-c9b98cfb9269"). InnerVolumeSpecName "kube-api-access-rbt62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.490787 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-operator-scripts\") pod \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.491156 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rznv\" (UniqueName: \"kubernetes.io/projected/92703eda-3f9e-40c1-9eef-c637ccfe0552-kube-api-access-2rznv\") pod \"92703eda-3f9e-40c1-9eef-c637ccfe0552\" (UID: \"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.491415 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/92703eda-3f9e-40c1-9eef-c637ccfe0552-operator-scripts\") pod \"92703eda-3f9e-40c1-9eef-c637ccfe0552\" (UID: \"92703eda-3f9e-40c1-9eef-c637ccfe0552\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.491516 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4r7w\" (UniqueName: \"kubernetes.io/projected/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-kube-api-access-k4r7w\") pod \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\" (UID: \"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5\") " Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.492802 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbt62\" (UniqueName: \"kubernetes.io/projected/38c5f767-2214-4103-94ef-c9b98cfb9269-kube-api-access-rbt62\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.492958 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d34d547f-c6ae-48f9-8df6-1d2d35942f23-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.493068 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38c5f767-2214-4103-94ef-c9b98cfb9269-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.493189 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c857da8-746f-4a51-b509-e6ed45614ab6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.493279 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmpzd\" (UniqueName: \"kubernetes.io/projected/6c857da8-746f-4a51-b509-e6ed45614ab6-kube-api-access-vmpzd\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.493385 4606 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhkj9\" (UniqueName: \"kubernetes.io/projected/d34d547f-c6ae-48f9-8df6-1d2d35942f23-kube-api-access-lhkj9\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.491987 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92703eda-3f9e-40c1-9eef-c637ccfe0552-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92703eda-3f9e-40c1-9eef-c637ccfe0552" (UID: "92703eda-3f9e-40c1-9eef-c637ccfe0552"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.494956 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5" (UID: "f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.496500 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92703eda-3f9e-40c1-9eef-c637ccfe0552-kube-api-access-2rznv" (OuterVolumeSpecName: "kube-api-access-2rznv") pod "92703eda-3f9e-40c1-9eef-c637ccfe0552" (UID: "92703eda-3f9e-40c1-9eef-c637ccfe0552"). InnerVolumeSpecName "kube-api-access-2rznv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.496611 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-kube-api-access-k4r7w" (OuterVolumeSpecName: "kube-api-access-k4r7w") pod "f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5" (UID: "f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5"). InnerVolumeSpecName "kube-api-access-k4r7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.594272 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.594300 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rznv\" (UniqueName: \"kubernetes.io/projected/92703eda-3f9e-40c1-9eef-c637ccfe0552-kube-api-access-2rznv\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.594311 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92703eda-3f9e-40c1-9eef-c637ccfe0552-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.594321 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4r7w\" (UniqueName: \"kubernetes.io/projected/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5-kube-api-access-k4r7w\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.608241 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee90-account-create-update-2wb28" event={"ID":"6c857da8-746f-4a51-b509-e6ed45614ab6","Type":"ContainerDied","Data":"809617d458fc9417e7c8cb68686c6a0cf51dd96295d7543ef842d6e16e6ec010"} Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.608440 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="809617d458fc9417e7c8cb68686c6a0cf51dd96295d7543ef842d6e16e6ec010" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.608618 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ee90-account-create-update-2wb28" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.610627 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rpf5j" event={"ID":"f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5","Type":"ContainerDied","Data":"e07202f06e075e65c26d9871dd004c200fb7179795cf8de55bfec9d4a5113908"} Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.610668 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e07202f06e075e65c26d9871dd004c200fb7179795cf8de55bfec9d4a5113908" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.610787 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rpf5j" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.617395 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z942v" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.617421 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z942v" event={"ID":"38c5f767-2214-4103-94ef-c9b98cfb9269","Type":"ContainerDied","Data":"ca1aa3cbda517801dead0c003e676ddcc787129e7edc82dbe0ff9ef13b3b980b"} Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.617470 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca1aa3cbda517801dead0c003e676ddcc787129e7edc82dbe0ff9ef13b3b980b" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.622405 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4df9-account-create-update-7t79l" event={"ID":"92703eda-3f9e-40c1-9eef-c637ccfe0552","Type":"ContainerDied","Data":"4342006cf3e8856aa8265219e541b6dd49c8a48a89e37420c864d33862b49e76"} Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.622437 4606 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4342006cf3e8856aa8265219e541b6dd49c8a48a89e37420c864d33862b49e76" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.622562 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4df9-account-create-update-7t79l" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.624477 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nm7sl" event={"ID":"b2358288-ebbf-4430-9684-7bcdf01349a4","Type":"ContainerDied","Data":"52428afdea2abfa879cc2e1634480a678dbda793b19ecb60a88de3b14473fa68"} Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.624486 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nm7sl" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.624496 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52428afdea2abfa879cc2e1634480a678dbda793b19ecb60a88de3b14473fa68" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.626686 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e57-account-create-update-bstwq" event={"ID":"d34d547f-c6ae-48f9-8df6-1d2d35942f23","Type":"ContainerDied","Data":"7e19dec114c640e929dd23e6d6394d86d21a378c99b440645911b5fbb8d74a0a"} Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.626714 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e19dec114c640e929dd23e6d6394d86d21a378c99b440645911b5fbb8d74a0a" Dec 12 00:44:38 crc kubenswrapper[4606]: I1212 00:44:38.626767 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6e57-account-create-update-bstwq" Dec 12 00:44:39 crc kubenswrapper[4606]: I1212 00:44:39.916649 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:39 crc kubenswrapper[4606]: E1212 00:44:39.916964 4606 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 12 00:44:39 crc kubenswrapper[4606]: E1212 00:44:39.917334 4606 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 12 00:44:39 crc kubenswrapper[4606]: E1212 00:44:39.917416 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift podName:6963a48d-4eff-4349-bc36-2356ec73c08c nodeName:}" failed. No retries permitted until 2025-12-12 00:44:55.917390431 +0000 UTC m=+1286.462743337 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift") pod "swift-storage-0" (UID: "6963a48d-4eff-4349-bc36-2356ec73c08c") : configmap "swift-ring-files" not found Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687380 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hhlpv"] Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687721 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2358288-ebbf-4430-9684-7bcdf01349a4" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687733 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2358288-ebbf-4430-9684-7bcdf01349a4" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687742 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34d547f-c6ae-48f9-8df6-1d2d35942f23" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687748 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34d547f-c6ae-48f9-8df6-1d2d35942f23" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687760 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerName="init" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687767 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerName="init" Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687776 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c5f767-2214-4103-94ef-c9b98cfb9269" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687782 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c5f767-2214-4103-94ef-c9b98cfb9269" 
containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687790 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c857da8-746f-4a51-b509-e6ed45614ab6" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687797 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c857da8-746f-4a51-b509-e6ed45614ab6" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687810 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerName="dnsmasq-dns" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687817 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerName="dnsmasq-dns" Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687828 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92703eda-3f9e-40c1-9eef-c637ccfe0552" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687834 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="92703eda-3f9e-40c1-9eef-c637ccfe0552" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: E1212 00:44:41.687843 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.687848 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688029 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2358288-ebbf-4430-9684-7bcdf01349a4" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688039 4606 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="38c5f767-2214-4103-94ef-c9b98cfb9269" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688051 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34d547f-c6ae-48f9-8df6-1d2d35942f23" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688057 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7f7dee-5960-4b93-890f-d5898b9d6457" containerName="dnsmasq-dns" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688066 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c857da8-746f-4a51-b509-e6ed45614ab6" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688075 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5" containerName="mariadb-database-create" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688082 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="92703eda-3f9e-40c1-9eef-c637ccfe0552" containerName="mariadb-account-create-update" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.688614 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.691425 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gs7ln" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.692543 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.695900 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hhlpv"] Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.768131 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-combined-ca-bundle\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.768506 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-db-sync-config-data\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.768564 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4wll\" (UniqueName: \"kubernetes.io/projected/9db5468c-db5e-4cbb-a854-d3e805d9744e-kube-api-access-h4wll\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.768626 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-config-data\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.870461 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-combined-ca-bundle\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.870520 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-db-sync-config-data\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.870572 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wll\" (UniqueName: \"kubernetes.io/projected/9db5468c-db5e-4cbb-a854-d3e805d9744e-kube-api-access-h4wll\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.870624 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-config-data\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.876157 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-db-sync-config-data\") pod \"glance-db-sync-hhlpv\" (UID: 
\"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.877146 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-config-data\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.877882 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-combined-ca-bundle\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:41 crc kubenswrapper[4606]: I1212 00:44:41.911283 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4wll\" (UniqueName: \"kubernetes.io/projected/9db5468c-db5e-4cbb-a854-d3e805d9744e-kube-api-access-h4wll\") pod \"glance-db-sync-hhlpv\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:42 crc kubenswrapper[4606]: I1212 00:44:42.048514 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hhlpv" Dec 12 00:44:42 crc kubenswrapper[4606]: I1212 00:44:42.706223 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hhlpv"] Dec 12 00:44:43 crc kubenswrapper[4606]: I1212 00:44:43.663892 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhlpv" event={"ID":"9db5468c-db5e-4cbb-a854-d3e805d9744e","Type":"ContainerStarted","Data":"85882ecd292359054740918b438d6aa4b74f055c5c8bc7fe9a73dc2ebef109f4"} Dec 12 00:44:45 crc kubenswrapper[4606]: I1212 00:44:45.686518 4606 generic.go:334] "Generic (PLEG): container finished" podID="d4a54eac-00ee-452a-9c4b-e777e338e670" containerID="b87adbb36f427f1e31235fbaca48e48cf1cbd3844df9ace46372196c22ec9e6a" exitCode=0 Dec 12 00:44:45 crc kubenswrapper[4606]: I1212 00:44:45.686610 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t97v8" event={"ID":"d4a54eac-00ee-452a-9c4b-e777e338e670","Type":"ContainerDied","Data":"b87adbb36f427f1e31235fbaca48e48cf1cbd3844df9ace46372196c22ec9e6a"} Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.192242 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-666ck" podUID="015ed993-f4fd-4928-a5ec-d13ad04b0105" containerName="ovn-controller" probeResult="failure" output=< Dec 12 00:44:46 crc kubenswrapper[4606]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 12 00:44:46 crc kubenswrapper[4606]: > Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.213292 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.223154 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fh8p9" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.452499 4606 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-666ck-config-mjsd6"] Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.453498 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.458784 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.464372 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666ck-config-mjsd6"] Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.549683 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-additional-scripts\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.549737 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.549777 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-log-ovn\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.549803 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-scripts\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.549835 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run-ovn\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.549896 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhnll\" (UniqueName: \"kubernetes.io/projected/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-kube-api-access-rhnll\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651486 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-log-ovn\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651551 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-scripts\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651586 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run-ovn\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651646 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhnll\" (UniqueName: \"kubernetes.io/projected/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-kube-api-access-rhnll\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651698 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-additional-scripts\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651715 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651830 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.651819 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-log-ovn\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.652149 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run-ovn\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.652784 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-additional-scripts\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.654407 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-scripts\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.692791 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhnll\" (UniqueName: \"kubernetes.io/projected/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-kube-api-access-rhnll\") pod \"ovn-controller-666ck-config-mjsd6\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:46 crc kubenswrapper[4606]: I1212 00:44:46.803653 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.068200 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.163527 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-scripts\") pod \"d4a54eac-00ee-452a-9c4b-e777e338e670\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.163612 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4a54eac-00ee-452a-9c4b-e777e338e670-etc-swift\") pod \"d4a54eac-00ee-452a-9c4b-e777e338e670\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.163692 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-swiftconf\") pod \"d4a54eac-00ee-452a-9c4b-e777e338e670\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.163809 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn7n2\" (UniqueName: \"kubernetes.io/projected/d4a54eac-00ee-452a-9c4b-e777e338e670-kube-api-access-dn7n2\") pod \"d4a54eac-00ee-452a-9c4b-e777e338e670\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.163844 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-dispersionconf\") pod \"d4a54eac-00ee-452a-9c4b-e777e338e670\" (UID: 
\"d4a54eac-00ee-452a-9c4b-e777e338e670\") " Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.163882 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-combined-ca-bundle\") pod \"d4a54eac-00ee-452a-9c4b-e777e338e670\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.163934 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-ring-data-devices\") pod \"d4a54eac-00ee-452a-9c4b-e777e338e670\" (UID: \"d4a54eac-00ee-452a-9c4b-e777e338e670\") " Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.166647 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d4a54eac-00ee-452a-9c4b-e777e338e670" (UID: "d4a54eac-00ee-452a-9c4b-e777e338e670"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.176313 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a54eac-00ee-452a-9c4b-e777e338e670-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d4a54eac-00ee-452a-9c4b-e777e338e670" (UID: "d4a54eac-00ee-452a-9c4b-e777e338e670"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.207874 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a54eac-00ee-452a-9c4b-e777e338e670-kube-api-access-dn7n2" (OuterVolumeSpecName: "kube-api-access-dn7n2") pod "d4a54eac-00ee-452a-9c4b-e777e338e670" (UID: "d4a54eac-00ee-452a-9c4b-e777e338e670"). InnerVolumeSpecName "kube-api-access-dn7n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.223377 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d4a54eac-00ee-452a-9c4b-e777e338e670" (UID: "d4a54eac-00ee-452a-9c4b-e777e338e670"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.260973 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4a54eac-00ee-452a-9c4b-e777e338e670" (UID: "d4a54eac-00ee-452a-9c4b-e777e338e670"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.266384 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d4a54eac-00ee-452a-9c4b-e777e338e670" (UID: "d4a54eac-00ee-452a-9c4b-e777e338e670"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267387 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-scripts" (OuterVolumeSpecName: "scripts") pod "d4a54eac-00ee-452a-9c4b-e777e338e670" (UID: "d4a54eac-00ee-452a-9c4b-e777e338e670"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267698 4606 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267721 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn7n2\" (UniqueName: \"kubernetes.io/projected/d4a54eac-00ee-452a-9c4b-e777e338e670-kube-api-access-dn7n2\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267733 4606 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267741 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a54eac-00ee-452a-9c4b-e777e338e670-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267751 4606 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267760 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d4a54eac-00ee-452a-9c4b-e777e338e670-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.267769 4606 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4a54eac-00ee-452a-9c4b-e777e338e670-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.342256 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-666ck-config-mjsd6"] Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.710907 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-t97v8" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.716147 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t97v8" event={"ID":"d4a54eac-00ee-452a-9c4b-e777e338e670","Type":"ContainerDied","Data":"db2ffc32613601924f1da4713ab038b95d293a0db158ae4f9d163c10075200e8"} Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.716192 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db2ffc32613601924f1da4713ab038b95d293a0db158ae4f9d163c10075200e8" Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.725464 4606 generic.go:334] "Generic (PLEG): container finished" podID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerID="8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c" exitCode=0 Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.725534 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0e415e37-636f-4f5d-a64e-4dd815e6030e","Type":"ContainerDied","Data":"8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c"} Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.734135 4606 generic.go:334] "Generic (PLEG): container finished" podID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" 
containerID="0de71c822d771a40ccdc61ecaaab12bca9931df22b2c4c086696c1a0a0173f7d" exitCode=0 Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.734240 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd9fd090-7c43-44f4-9951-10b4528fc8a2","Type":"ContainerDied","Data":"0de71c822d771a40ccdc61ecaaab12bca9931df22b2c4c086696c1a0a0173f7d"} Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.745542 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666ck-config-mjsd6" event={"ID":"30efe5fb-3953-4a70-88f3-d56ecf9bacf1","Type":"ContainerStarted","Data":"19267573f605240774577b26e61c1f3f1d10bebac2b7012f0af98dd1b671898b"} Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.745586 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666ck-config-mjsd6" event={"ID":"30efe5fb-3953-4a70-88f3-d56ecf9bacf1","Type":"ContainerStarted","Data":"54e9c3b44c5fc9a676632d3598a192bcf9a784b2a6009d40de583b07ff537e34"} Dec 12 00:44:47 crc kubenswrapper[4606]: I1212 00:44:47.808834 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-666ck-config-mjsd6" podStartSLOduration=1.808813653 podStartE2EDuration="1.808813653s" podCreationTimestamp="2025-12-12 00:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:44:47.76920372 +0000 UTC m=+1278.314556616" watchObservedRunningTime="2025-12-12 00:44:47.808813653 +0000 UTC m=+1278.354166519" Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.754580 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd9fd090-7c43-44f4-9951-10b4528fc8a2","Type":"ContainerStarted","Data":"bb5b1403233bfd6bb8bd59205d11593b5e080ca5002733c88e5911b16beedaf3"} Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.756038 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.757701 4606 generic.go:334] "Generic (PLEG): container finished" podID="30efe5fb-3953-4a70-88f3-d56ecf9bacf1" containerID="19267573f605240774577b26e61c1f3f1d10bebac2b7012f0af98dd1b671898b" exitCode=0 Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.757753 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-666ck-config-mjsd6" event={"ID":"30efe5fb-3953-4a70-88f3-d56ecf9bacf1","Type":"ContainerDied","Data":"19267573f605240774577b26e61c1f3f1d10bebac2b7012f0af98dd1b671898b"} Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.760172 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0e415e37-636f-4f5d-a64e-4dd815e6030e","Type":"ContainerStarted","Data":"b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d"} Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.760598 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.785906 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.339365547 podStartE2EDuration="1m12.785892225s" podCreationTimestamp="2025-12-12 00:43:36 +0000 UTC" firstStartedPulling="2025-12-12 00:43:38.563866459 +0000 UTC m=+1209.109219325" lastFinishedPulling="2025-12-12 00:44:14.010393127 +0000 UTC m=+1244.555746003" observedRunningTime="2025-12-12 00:44:48.783945633 +0000 UTC m=+1279.329298499" watchObservedRunningTime="2025-12-12 00:44:48.785892225 +0000 UTC m=+1279.331245081" Dec 12 00:44:48 crc kubenswrapper[4606]: I1212 00:44:48.862965 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371963.991835 
podStartE2EDuration="1m12.862940493s" podCreationTimestamp="2025-12-12 00:43:36 +0000 UTC" firstStartedPulling="2025-12-12 00:43:38.275968434 +0000 UTC m=+1208.821321300" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:44:48.848332221 +0000 UTC m=+1279.393685087" watchObservedRunningTime="2025-12-12 00:44:48.862940493 +0000 UTC m=+1279.408293359" Dec 12 00:44:51 crc kubenswrapper[4606]: I1212 00:44:51.199048 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-666ck" Dec 12 00:44:56 crc kubenswrapper[4606]: I1212 00:44:56.009716 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:56 crc kubenswrapper[4606]: I1212 00:44:56.018139 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6963a48d-4eff-4349-bc36-2356ec73c08c-etc-swift\") pod \"swift-storage-0\" (UID: \"6963a48d-4eff-4349-bc36-2356ec73c08c\") " pod="openstack/swift-storage-0" Dec 12 00:44:56 crc kubenswrapper[4606]: I1212 00:44:56.105794 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.389608 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.567756 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-scripts\") pod \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568140 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run-ovn\") pod \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568254 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "30efe5fb-3953-4a70-88f3-d56ecf9bacf1" (UID: "30efe5fb-3953-4a70-88f3-d56ecf9bacf1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568451 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-additional-scripts\") pod \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568535 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhnll\" (UniqueName: \"kubernetes.io/projected/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-kube-api-access-rhnll\") pod \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568565 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-log-ovn\") pod \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568591 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run\") pod \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\" (UID: \"30efe5fb-3953-4a70-88f3-d56ecf9bacf1\") " Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568720 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "30efe5fb-3953-4a70-88f3-d56ecf9bacf1" (UID: "30efe5fb-3953-4a70-88f3-d56ecf9bacf1"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568784 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run" (OuterVolumeSpecName: "var-run") pod "30efe5fb-3953-4a70-88f3-d56ecf9bacf1" (UID: "30efe5fb-3953-4a70-88f3-d56ecf9bacf1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.568834 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-scripts" (OuterVolumeSpecName: "scripts") pod "30efe5fb-3953-4a70-88f3-d56ecf9bacf1" (UID: "30efe5fb-3953-4a70-88f3-d56ecf9bacf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.569036 4606 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.569050 4606 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.569673 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.569688 4606 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.569334 4606 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "30efe5fb-3953-4a70-88f3-d56ecf9bacf1" (UID: "30efe5fb-3953-4a70-88f3-d56ecf9bacf1"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.571845 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-kube-api-access-rhnll" (OuterVolumeSpecName: "kube-api-access-rhnll") pod "30efe5fb-3953-4a70-88f3-d56ecf9bacf1" (UID: "30efe5fb-3953-4a70-88f3-d56ecf9bacf1"). InnerVolumeSpecName "kube-api-access-rhnll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.671220 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhnll\" (UniqueName: \"kubernetes.io/projected/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-kube-api-access-rhnll\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.671498 4606 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30efe5fb-3953-4a70-88f3-d56ecf9bacf1-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.782999 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.853297 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"17aa0ce87641585f93da4d72274f4f39c2c38c21308b9557f7653c3890f57134"} Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.854377 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-666ck-config-mjsd6" event={"ID":"30efe5fb-3953-4a70-88f3-d56ecf9bacf1","Type":"ContainerDied","Data":"54e9c3b44c5fc9a676632d3598a192bcf9a784b2a6009d40de583b07ff537e34"} Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.854401 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54e9c3b44c5fc9a676632d3598a192bcf9a784b2a6009d40de583b07ff537e34" Dec 12 00:44:58 crc kubenswrapper[4606]: I1212 00:44:58.854422 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-666ck-config-mjsd6" Dec 12 00:44:59 crc kubenswrapper[4606]: I1212 00:44:59.509035 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-666ck-config-mjsd6"] Dec 12 00:44:59 crc kubenswrapper[4606]: I1212 00:44:59.516609 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-666ck-config-mjsd6"] Dec 12 00:44:59 crc kubenswrapper[4606]: I1212 00:44:59.710317 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30efe5fb-3953-4a70-88f3-d56ecf9bacf1" path="/var/lib/kubelet/pods/30efe5fb-3953-4a70-88f3-d56ecf9bacf1/volumes" Dec 12 00:44:59 crc kubenswrapper[4606]: I1212 00:44:59.866596 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhlpv" event={"ID":"9db5468c-db5e-4cbb-a854-d3e805d9744e","Type":"ContainerStarted","Data":"8944007f23342877bee55299498a19ce188152f5110dc69dad0fc44d79539aa2"} Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.139639 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hhlpv" podStartSLOduration=3.493235603 podStartE2EDuration="19.139620094s" podCreationTimestamp="2025-12-12 00:44:41 +0000 UTC" firstStartedPulling="2025-12-12 00:44:42.725602375 +0000 UTC m=+1273.270955251" lastFinishedPulling="2025-12-12 00:44:58.371986876 +0000 UTC m=+1288.917339742" observedRunningTime="2025-12-12 
00:44:59.892964674 +0000 UTC m=+1290.438317540" watchObservedRunningTime="2025-12-12 00:45:00.139620094 +0000 UTC m=+1290.684972960" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.170345 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq"] Dec 12 00:45:00 crc kubenswrapper[4606]: E1212 00:45:00.171427 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30efe5fb-3953-4a70-88f3-d56ecf9bacf1" containerName="ovn-config" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.171454 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="30efe5fb-3953-4a70-88f3-d56ecf9bacf1" containerName="ovn-config" Dec 12 00:45:00 crc kubenswrapper[4606]: E1212 00:45:00.171471 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a54eac-00ee-452a-9c4b-e777e338e670" containerName="swift-ring-rebalance" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.171480 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a54eac-00ee-452a-9c4b-e777e338e670" containerName="swift-ring-rebalance" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.171904 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="30efe5fb-3953-4a70-88f3-d56ecf9bacf1" containerName="ovn-config" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.171932 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a54eac-00ee-452a-9c4b-e777e338e670" containerName="swift-ring-rebalance" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.172901 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.179613 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.180053 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.199371 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq"] Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.309085 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b47c7\" (UniqueName: \"kubernetes.io/projected/a80843b7-dee1-423c-b28b-3fbcdf367999-kube-api-access-b47c7\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.309138 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a80843b7-dee1-423c-b28b-3fbcdf367999-secret-volume\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.309158 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a80843b7-dee1-423c-b28b-3fbcdf367999-config-volume\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.410939 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b47c7\" (UniqueName: \"kubernetes.io/projected/a80843b7-dee1-423c-b28b-3fbcdf367999-kube-api-access-b47c7\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.411004 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a80843b7-dee1-423c-b28b-3fbcdf367999-secret-volume\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.411031 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a80843b7-dee1-423c-b28b-3fbcdf367999-config-volume\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.412100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a80843b7-dee1-423c-b28b-3fbcdf367999-config-volume\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.418528 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a80843b7-dee1-423c-b28b-3fbcdf367999-secret-volume\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.437757 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b47c7\" (UniqueName: \"kubernetes.io/projected/a80843b7-dee1-423c-b28b-3fbcdf367999-kube-api-access-b47c7\") pod \"collect-profiles-29425005-jc4rq\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.503995 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.881167 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"cfefaed313a195e1bcc369c896675b2c626cc8ae4abfe735ffa90f2042ee0f6e"} Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.881345 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"aedde826d9f1ef5e25d319592325d5ebffcae11fe106a41bcbb1369ef7a96c65"} Dec 12 00:45:00 crc kubenswrapper[4606]: I1212 00:45:00.881355 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"4935a8b5c891c6b80cf0db4b478dedce3e4e748b9e3aa94b2a5de67201107ca7"} Dec 12 00:45:01 crc kubenswrapper[4606]: I1212 00:45:01.015444 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq"] Dec 12 00:45:01 
crc kubenswrapper[4606]: I1212 00:45:01.893737 4606 generic.go:334] "Generic (PLEG): container finished" podID="a80843b7-dee1-423c-b28b-3fbcdf367999" containerID="e99e5aad3425459457c138f7c529a53e5bfb99297b683e12d0fdd1457fd6c6e2" exitCode=0 Dec 12 00:45:01 crc kubenswrapper[4606]: I1212 00:45:01.893847 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" event={"ID":"a80843b7-dee1-423c-b28b-3fbcdf367999","Type":"ContainerDied","Data":"e99e5aad3425459457c138f7c529a53e5bfb99297b683e12d0fdd1457fd6c6e2"} Dec 12 00:45:01 crc kubenswrapper[4606]: I1212 00:45:01.894151 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" event={"ID":"a80843b7-dee1-423c-b28b-3fbcdf367999","Type":"ContainerStarted","Data":"ffec15cd1d4bb6f4acaa4693fc53f37dd9571bb9d04058309732034c2986aa48"} Dec 12 00:45:01 crc kubenswrapper[4606]: I1212 00:45:01.898455 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"af28b65435015d29d9623c3859248859f3934d12e6d6bd33a869fe8a4e39c926"} Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.010633 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.010713 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:45:02 crc 
kubenswrapper[4606]: I1212 00:45:02.010780 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.011522 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e80327b00df207db6b4792ec6ccf6cd67956fd25c801ee29c3af3ba674cbe9cc"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.011582 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://e80327b00df207db6b4792ec6ccf6cd67956fd25c801ee29c3af3ba674cbe9cc" gracePeriod=600 Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.910817 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"fd12dc7f1c2f03242f4354e990fba2720b0f37d6d03a1eb30d8a2ff04965b035"} Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.912038 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"8ef189df5a0b2069d9fc34287c85d4ae3c1b87149f04495437c91f05d6e45600"} Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.923725 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="e80327b00df207db6b4792ec6ccf6cd67956fd25c801ee29c3af3ba674cbe9cc" exitCode=0 Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.923774 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"e80327b00df207db6b4792ec6ccf6cd67956fd25c801ee29c3af3ba674cbe9cc"} Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.923806 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37"} Dec 12 00:45:02 crc kubenswrapper[4606]: I1212 00:45:02.923822 4606 scope.go:117] "RemoveContainer" containerID="98193dc190ed04d9478e682edb5e4363e657a585ee1347d2eb910b80fed16f3f" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.300825 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.467679 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b47c7\" (UniqueName: \"kubernetes.io/projected/a80843b7-dee1-423c-b28b-3fbcdf367999-kube-api-access-b47c7\") pod \"a80843b7-dee1-423c-b28b-3fbcdf367999\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.467766 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a80843b7-dee1-423c-b28b-3fbcdf367999-config-volume\") pod \"a80843b7-dee1-423c-b28b-3fbcdf367999\" (UID: \"a80843b7-dee1-423c-b28b-3fbcdf367999\") " Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.467911 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a80843b7-dee1-423c-b28b-3fbcdf367999-secret-volume\") pod \"a80843b7-dee1-423c-b28b-3fbcdf367999\" (UID: 
\"a80843b7-dee1-423c-b28b-3fbcdf367999\") " Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.469594 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a80843b7-dee1-423c-b28b-3fbcdf367999-config-volume" (OuterVolumeSpecName: "config-volume") pod "a80843b7-dee1-423c-b28b-3fbcdf367999" (UID: "a80843b7-dee1-423c-b28b-3fbcdf367999"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.478396 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a80843b7-dee1-423c-b28b-3fbcdf367999-kube-api-access-b47c7" (OuterVolumeSpecName: "kube-api-access-b47c7") pod "a80843b7-dee1-423c-b28b-3fbcdf367999" (UID: "a80843b7-dee1-423c-b28b-3fbcdf367999"). InnerVolumeSpecName "kube-api-access-b47c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.479224 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80843b7-dee1-423c-b28b-3fbcdf367999-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a80843b7-dee1-423c-b28b-3fbcdf367999" (UID: "a80843b7-dee1-423c-b28b-3fbcdf367999"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.571017 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a80843b7-dee1-423c-b28b-3fbcdf367999-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.571052 4606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a80843b7-dee1-423c-b28b-3fbcdf367999-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.571062 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b47c7\" (UniqueName: \"kubernetes.io/projected/a80843b7-dee1-423c-b28b-3fbcdf367999-kube-api-access-b47c7\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.935370 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"31eb18085840f9f7c7e24f130ff0c34484b6d4de0ebc848939e8249c8e509442"} Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.936622 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"f22fb2c83c287d5a3c8a63f9c15891d4c97f8af2e4d1b03ee2e8509cf57d003f"} Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.941392 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" event={"ID":"a80843b7-dee1-423c-b28b-3fbcdf367999","Type":"ContainerDied","Data":"ffec15cd1d4bb6f4acaa4693fc53f37dd9571bb9d04058309732034c2986aa48"} Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.941431 4606 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ffec15cd1d4bb6f4acaa4693fc53f37dd9571bb9d04058309732034c2986aa48" Dec 12 00:45:03 crc kubenswrapper[4606]: I1212 00:45:03.941679 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq" Dec 12 00:45:04 crc kubenswrapper[4606]: I1212 00:45:04.975285 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"0d816c60ec3817802b1c51565dccefc4efb48830429e4b362a4046d3b8f25e4b"} Dec 12 00:45:04 crc kubenswrapper[4606]: I1212 00:45:04.976322 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"a4b009182958886b5e73eedda8aabef469d4e3cb46b0888df4c80e230cca2705"} Dec 12 00:45:05 crc kubenswrapper[4606]: I1212 00:45:05.988782 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"072afa5b078b4eb31ffc0a2427b161d2f942c9795b3387f1ae586378af8460be"} Dec 12 00:45:05 crc kubenswrapper[4606]: I1212 00:45:05.990413 4606 generic.go:334] "Generic (PLEG): container finished" podID="9db5468c-db5e-4cbb-a854-d3e805d9744e" containerID="8944007f23342877bee55299498a19ce188152f5110dc69dad0fc44d79539aa2" exitCode=0 Dec 12 00:45:05 crc kubenswrapper[4606]: I1212 00:45:05.990463 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhlpv" event={"ID":"9db5468c-db5e-4cbb-a854-d3e805d9744e","Type":"ContainerDied","Data":"8944007f23342877bee55299498a19ce188152f5110dc69dad0fc44d79539aa2"} Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.522516 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.835270 
4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8mpd7"] Dec 12 00:45:07 crc kubenswrapper[4606]: E1212 00:45:07.836086 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80843b7-dee1-423c-b28b-3fbcdf367999" containerName="collect-profiles" Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.836167 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80843b7-dee1-423c-b28b-3fbcdf367999" containerName="collect-profiles" Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.836416 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="a80843b7-dee1-423c-b28b-3fbcdf367999" containerName="collect-profiles" Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.836949 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.848318 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8mpd7"] Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.915372 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.944141 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-operator-scripts\") pod \"cinder-db-create-8mpd7\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:07 crc kubenswrapper[4606]: I1212 00:45:07.944471 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-kube-api-access-t442m\") pod \"cinder-db-create-8mpd7\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " 
pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.007402 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-019c-account-create-update-h2brj"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.008513 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.012415 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.045564 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-operator-scripts\") pod \"cinder-db-create-8mpd7\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.045610 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-kube-api-access-t442m\") pod \"cinder-db-create-8mpd7\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.046746 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-operator-scripts\") pod \"cinder-db-create-8mpd7\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.063291 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"aa9eb720caa0e75d2ef3627d4870cbb783eaae34bee2db03cc651827a61ceda6"} Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.071256 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-019c-account-create-update-h2brj"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.077855 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-kube-api-access-t442m\") pod \"cinder-db-create-8mpd7\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.152824 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be20f016-2639-4715-80af-5719a068a857-operator-scripts\") pod \"cinder-019c-account-create-update-h2brj\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.153269 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxxj5\" (UniqueName: \"kubernetes.io/projected/be20f016-2639-4715-80af-5719a068a857-kube-api-access-rxxj5\") pod \"cinder-019c-account-create-update-h2brj\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.161286 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.177874 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-nrs7j"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.189593 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nrs7j"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.189707 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.231252 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5080-account-create-update-ztlj6"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.235494 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.241812 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.255991 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be20f016-2639-4715-80af-5719a068a857-operator-scripts\") pod \"cinder-019c-account-create-update-h2brj\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.256129 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxxj5\" (UniqueName: \"kubernetes.io/projected/be20f016-2639-4715-80af-5719a068a857-kube-api-access-rxxj5\") pod \"cinder-019c-account-create-update-h2brj\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.257555 
4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be20f016-2639-4715-80af-5719a068a857-operator-scripts\") pod \"cinder-019c-account-create-update-h2brj\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.259534 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5080-account-create-update-ztlj6"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.306303 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxxj5\" (UniqueName: \"kubernetes.io/projected/be20f016-2639-4715-80af-5719a068a857-kube-api-access-rxxj5\") pod \"cinder-019c-account-create-update-h2brj\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.339098 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.360949 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92f2dac5-52f1-4438-aa85-059425ed7822-operator-scripts\") pod \"barbican-5080-account-create-update-ztlj6\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.361022 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6kdf\" (UniqueName: \"kubernetes.io/projected/92f2dac5-52f1-4438-aa85-059425ed7822-kube-api-access-q6kdf\") pod \"barbican-5080-account-create-update-ztlj6\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.361089 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjn6b\" (UniqueName: \"kubernetes.io/projected/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-kube-api-access-pjn6b\") pod \"barbican-db-create-nrs7j\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.361122 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-operator-scripts\") pod \"barbican-db-create-nrs7j\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.374330 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4rtsn"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 
00:45:08.375483 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.417908 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4rtsn"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.463001 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjn6b\" (UniqueName: \"kubernetes.io/projected/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-kube-api-access-pjn6b\") pod \"barbican-db-create-nrs7j\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.463060 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjpjt\" (UniqueName: \"kubernetes.io/projected/0ea658e5-7318-49c3-ab43-00c885902e4f-kube-api-access-fjpjt\") pod \"neutron-db-create-4rtsn\" (UID: \"0ea658e5-7318-49c3-ab43-00c885902e4f\") " pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.463089 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-operator-scripts\") pod \"barbican-db-create-nrs7j\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.463192 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92f2dac5-52f1-4438-aa85-059425ed7822-operator-scripts\") pod \"barbican-5080-account-create-update-ztlj6\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.463229 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q6kdf\" (UniqueName: \"kubernetes.io/projected/92f2dac5-52f1-4438-aa85-059425ed7822-kube-api-access-q6kdf\") pod \"barbican-5080-account-create-update-ztlj6\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.463255 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea658e5-7318-49c3-ab43-00c885902e4f-operator-scripts\") pod \"neutron-db-create-4rtsn\" (UID: \"0ea658e5-7318-49c3-ab43-00c885902e4f\") " pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.464209 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92f2dac5-52f1-4438-aa85-059425ed7822-operator-scripts\") pod \"barbican-5080-account-create-update-ztlj6\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.465003 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-operator-scripts\") pod \"barbican-db-create-nrs7j\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.494573 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjn6b\" (UniqueName: \"kubernetes.io/projected/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-kube-api-access-pjn6b\") pod \"barbican-db-create-nrs7j\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.547983 4606 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.559134 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6kdf\" (UniqueName: \"kubernetes.io/projected/92f2dac5-52f1-4438-aa85-059425ed7822-kube-api-access-q6kdf\") pod \"barbican-5080-account-create-update-ztlj6\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.565618 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea658e5-7318-49c3-ab43-00c885902e4f-operator-scripts\") pod \"neutron-db-create-4rtsn\" (UID: \"0ea658e5-7318-49c3-ab43-00c885902e4f\") " pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.565684 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjpjt\" (UniqueName: \"kubernetes.io/projected/0ea658e5-7318-49c3-ab43-00c885902e4f-kube-api-access-fjpjt\") pod \"neutron-db-create-4rtsn\" (UID: \"0ea658e5-7318-49c3-ab43-00c885902e4f\") " pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.566717 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea658e5-7318-49c3-ab43-00c885902e4f-operator-scripts\") pod \"neutron-db-create-4rtsn\" (UID: \"0ea658e5-7318-49c3-ab43-00c885902e4f\") " pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.602996 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjpjt\" (UniqueName: \"kubernetes.io/projected/0ea658e5-7318-49c3-ab43-00c885902e4f-kube-api-access-fjpjt\") pod \"neutron-db-create-4rtsn\" (UID: 
\"0ea658e5-7318-49c3-ab43-00c885902e4f\") " pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.607599 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-j24jk"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.619346 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.628934 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d1d-account-create-update-r2rq9"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.630185 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.633640 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wkdqx" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.633776 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.633738 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.634094 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.638402 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d1d-account-create-update-r2rq9"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.645794 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-j24jk"] Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.649702 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 
00:45:08.764649 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.768766 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hhlpv" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.770509 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbz25\" (UniqueName: \"kubernetes.io/projected/a58ecb89-37d3-4d35-9a1a-9820df05848e-kube-api-access-xbz25\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.770559 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-config-data\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.770628 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5v2r\" (UniqueName: \"kubernetes.io/projected/18577071-2d04-4c75-99fd-cd721836a571-kube-api-access-j5v2r\") pod \"neutron-5d1d-account-create-update-r2rq9\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.770658 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18577071-2d04-4c75-99fd-cd721836a571-operator-scripts\") pod \"neutron-5d1d-account-create-update-r2rq9\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " 
pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.770682 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-combined-ca-bundle\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.770785 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.871790 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-combined-ca-bundle\") pod \"9db5468c-db5e-4cbb-a854-d3e805d9744e\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.871906 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4wll\" (UniqueName: \"kubernetes.io/projected/9db5468c-db5e-4cbb-a854-d3e805d9744e-kube-api-access-h4wll\") pod \"9db5468c-db5e-4cbb-a854-d3e805d9744e\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.871994 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-db-sync-config-data\") pod \"9db5468c-db5e-4cbb-a854-d3e805d9744e\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.872027 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-config-data\") pod 
\"9db5468c-db5e-4cbb-a854-d3e805d9744e\" (UID: \"9db5468c-db5e-4cbb-a854-d3e805d9744e\") " Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.872470 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbz25\" (UniqueName: \"kubernetes.io/projected/a58ecb89-37d3-4d35-9a1a-9820df05848e-kube-api-access-xbz25\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.872514 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-config-data\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.872597 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5v2r\" (UniqueName: \"kubernetes.io/projected/18577071-2d04-4c75-99fd-cd721836a571-kube-api-access-j5v2r\") pod \"neutron-5d1d-account-create-update-r2rq9\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.872623 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18577071-2d04-4c75-99fd-cd721836a571-operator-scripts\") pod \"neutron-5d1d-account-create-update-r2rq9\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.872648 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-combined-ca-bundle\") pod \"keystone-db-sync-j24jk\" (UID: 
\"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.878862 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18577071-2d04-4c75-99fd-cd721836a571-operator-scripts\") pod \"neutron-5d1d-account-create-update-r2rq9\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.887887 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db5468c-db5e-4cbb-a854-d3e805d9744e-kube-api-access-h4wll" (OuterVolumeSpecName: "kube-api-access-h4wll") pod "9db5468c-db5e-4cbb-a854-d3e805d9744e" (UID: "9db5468c-db5e-4cbb-a854-d3e805d9744e"). InnerVolumeSpecName "kube-api-access-h4wll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.888885 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-combined-ca-bundle\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.890706 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-config-data\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.894415 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9db5468c-db5e-4cbb-a854-d3e805d9744e" 
(UID: "9db5468c-db5e-4cbb-a854-d3e805d9744e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.923769 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbz25\" (UniqueName: \"kubernetes.io/projected/a58ecb89-37d3-4d35-9a1a-9820df05848e-kube-api-access-xbz25\") pod \"keystone-db-sync-j24jk\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.931026 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5v2r\" (UniqueName: \"kubernetes.io/projected/18577071-2d04-4c75-99fd-cd721836a571-kube-api-access-j5v2r\") pod \"neutron-5d1d-account-create-update-r2rq9\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.972283 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9db5468c-db5e-4cbb-a854-d3e805d9744e" (UID: "9db5468c-db5e-4cbb-a854-d3e805d9744e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.976359 4606 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.976395 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.976405 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4wll\" (UniqueName: \"kubernetes.io/projected/9db5468c-db5e-4cbb-a854-d3e805d9744e-kube-api-access-h4wll\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.993464 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:08 crc kubenswrapper[4606]: I1212 00:45:08.998538 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.016792 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-config-data" (OuterVolumeSpecName: "config-data") pod "9db5468c-db5e-4cbb-a854-d3e805d9744e" (UID: "9db5468c-db5e-4cbb-a854-d3e805d9744e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.053341 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8mpd7"] Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.079045 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5468c-db5e-4cbb-a854-d3e805d9744e-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.147698 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"0c7f3ffc2ea0c381903e654819c90f68a82ebf6b82fab6f8932fa09a67561e61"} Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.169809 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hhlpv" event={"ID":"9db5468c-db5e-4cbb-a854-d3e805d9744e","Type":"ContainerDied","Data":"85882ecd292359054740918b438d6aa4b74f055c5c8bc7fe9a73dc2ebef109f4"} Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.169849 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85882ecd292359054740918b438d6aa4b74f055c5c8bc7fe9a73dc2ebef109f4" Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.169908 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hhlpv" Dec 12 00:45:09 crc kubenswrapper[4606]: W1212 00:45:09.172408 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00e1c7b_96a5_46de_8f2e_66ed8ff9275c.slice/crio-5cee1959311ff9754ac3a4c3bd618855096fd46052eccf453f7074a8403a8fa0 WatchSource:0}: Error finding container 5cee1959311ff9754ac3a4c3bd618855096fd46052eccf453f7074a8403a8fa0: Status 404 returned error can't find the container with id 5cee1959311ff9754ac3a4c3bd618855096fd46052eccf453f7074a8403a8fa0 Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.505156 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-019c-account-create-update-h2brj"] Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.587201 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nrs7j"] Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.626150 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5080-account-create-update-ztlj6"] Dec 12 00:45:09 crc kubenswrapper[4606]: I1212 00:45:09.972852 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4rtsn"] Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.032045 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d1d-account-create-update-r2rq9"] Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.079252 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-j24jk"] Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.242158 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nrs7j" event={"ID":"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa","Type":"ContainerStarted","Data":"28dd3737c76d219cc9b1806a838aaa6e389ed9dc5614f000677c18465b7744ec"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.242418 4606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nrs7j" event={"ID":"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa","Type":"ContainerStarted","Data":"39b6a4bc0fce184e32bef85c40612c3f29be4c1f83517b7a82af50470e7ff488"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.250996 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d1d-account-create-update-r2rq9" event={"ID":"18577071-2d04-4c75-99fd-cd721836a571","Type":"ContainerStarted","Data":"765362ecf5875898861ae1419e00ce1b3a54c4be5093289c2b6469508b050feb"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.252151 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4rtsn" event={"ID":"0ea658e5-7318-49c3-ab43-00c885902e4f","Type":"ContainerStarted","Data":"38b2f19e1c99058db1edbdd0abf4e3a5f4ecd02d6a4e19811442534942b1499d"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.277381 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8mpd7" event={"ID":"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c","Type":"ContainerStarted","Data":"2960132e59f66efa0c957f351e2825dee6705cc8549fd74234110e8fe867a2bf"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.277421 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8mpd7" event={"ID":"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c","Type":"ContainerStarted","Data":"5cee1959311ff9754ac3a4c3bd618855096fd46052eccf453f7074a8403a8fa0"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.282722 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-nrs7j" podStartSLOduration=2.282707243 podStartE2EDuration="2.282707243s" podCreationTimestamp="2025-12-12 00:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:10.272438687 +0000 UTC m=+1300.817791553" 
watchObservedRunningTime="2025-12-12 00:45:10.282707243 +0000 UTC m=+1300.828060109" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.297666 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j24jk" event={"ID":"a58ecb89-37d3-4d35-9a1a-9820df05848e","Type":"ContainerStarted","Data":"a89b6129fedb42ded877a2fc205b4e982a69d5020aa53fc35a5828c112134a7d"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.315478 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-8mpd7" podStartSLOduration=3.315459292 podStartE2EDuration="3.315459292s" podCreationTimestamp="2025-12-12 00:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:10.308346961 +0000 UTC m=+1300.853699827" watchObservedRunningTime="2025-12-12 00:45:10.315459292 +0000 UTC m=+1300.860812158" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.323526 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5080-account-create-update-ztlj6" event={"ID":"92f2dac5-52f1-4438-aa85-059425ed7822","Type":"ContainerStarted","Data":"e76773484484c312e8bf03d6e9ac05b4491b9cd0a33d4352cfc58f910464c2e5"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.327609 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5080-account-create-update-ztlj6" event={"ID":"92f2dac5-52f1-4438-aa85-059425ed7822","Type":"ContainerStarted","Data":"eb252b5ca0081c62840a7096bc15f2aa9bab374502d16fe5eeade13595384b0f"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.350523 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-9rbg7"] Dec 12 00:45:10 crc kubenswrapper[4606]: E1212 00:45:10.358668 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db5468c-db5e-4cbb-a854-d3e805d9744e" containerName="glance-db-sync" Dec 12 00:45:10 crc 
kubenswrapper[4606]: I1212 00:45:10.358701 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db5468c-db5e-4cbb-a854-d3e805d9744e" containerName="glance-db-sync" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.358865 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db5468c-db5e-4cbb-a854-d3e805d9744e" containerName="glance-db-sync" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.359798 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.368313 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-9rbg7"] Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.389083 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-5080-account-create-update-ztlj6" podStartSLOduration=2.389064217 podStartE2EDuration="2.389064217s" podCreationTimestamp="2025-12-12 00:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:10.379075369 +0000 UTC m=+1300.924428245" watchObservedRunningTime="2025-12-12 00:45:10.389064217 +0000 UTC m=+1300.934417083" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.454427 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"dcff755e36f3c435e3ddcba9f1765905e16e64f596a157e0250b37962b242d23"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.454470 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6963a48d-4eff-4349-bc36-2356ec73c08c","Type":"ContainerStarted","Data":"124c0a8c272a3373b3a7371c10b0e7ac28a4d1877509848e9ffedda23ca309d0"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.469940 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-019c-account-create-update-h2brj" event={"ID":"be20f016-2639-4715-80af-5719a068a857","Type":"ContainerStarted","Data":"54016490d6a36738eecae0c922fe64bcf2044ea1661f592cea65e0ef980ce0ae"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.469995 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-019c-account-create-update-h2brj" event={"ID":"be20f016-2639-4715-80af-5719a068a857","Type":"ContainerStarted","Data":"674a4c415bd9dafe9e0eba94768168650ce80ad885678af11a9939e107bef37e"} Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.532967 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.533242 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-dns-svc\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.533329 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-config\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.533476 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67ch9\" (UniqueName: 
\"kubernetes.io/projected/5017a1a6-ef1c-4277-9282-9bf58af30ae7-kube-api-access-67ch9\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.533564 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.552976 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.804719501 podStartE2EDuration="48.552959916s" podCreationTimestamp="2025-12-12 00:44:22 +0000 UTC" firstStartedPulling="2025-12-12 00:44:58.794084224 +0000 UTC m=+1289.339437090" lastFinishedPulling="2025-12-12 00:45:04.542324639 +0000 UTC m=+1295.087677505" observedRunningTime="2025-12-12 00:45:10.510303351 +0000 UTC m=+1301.055656227" watchObservedRunningTime="2025-12-12 00:45:10.552959916 +0000 UTC m=+1301.098312782" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.555535 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-019c-account-create-update-h2brj" podStartSLOduration=3.555528135 podStartE2EDuration="3.555528135s" podCreationTimestamp="2025-12-12 00:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:10.551470416 +0000 UTC m=+1301.096823282" watchObservedRunningTime="2025-12-12 00:45:10.555528135 +0000 UTC m=+1301.100880991" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.636093 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.636989 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-dns-svc\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.637072 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-config\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.637209 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67ch9\" (UniqueName: \"kubernetes.io/projected/5017a1a6-ef1c-4277-9282-9bf58af30ae7-kube-api-access-67ch9\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.637301 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.637970 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb\") pod 
\"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.640729 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-config\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.641046 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.641308 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-dns-svc\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.682566 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67ch9\" (UniqueName: \"kubernetes.io/projected/5017a1a6-ef1c-4277-9282-9bf58af30ae7-kube-api-access-67ch9\") pod \"dnsmasq-dns-74dc88fc-9rbg7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:10 crc kubenswrapper[4606]: I1212 00:45:10.691933 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:10.978949 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-9rbg7"] Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.014246 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ms77"] Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.032975 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.038976 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.104504 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ms77"] Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.161684 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-config\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.161768 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.161796 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.161961 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.162021 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsfhm\" (UniqueName: \"kubernetes.io/projected/31335089-493b-42bb-9a5f-cb4ea39951f4-kube-api-access-vsfhm\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.162269 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.264074 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.264133 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-config\") pod 
\"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.264183 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.264201 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.264228 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.264245 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsfhm\" (UniqueName: \"kubernetes.io/projected/31335089-493b-42bb-9a5f-cb4ea39951f4-kube-api-access-vsfhm\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.265255 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: 
\"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.265909 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-config\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.266350 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.266468 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.266998 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.319820 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsfhm\" (UniqueName: \"kubernetes.io/projected/31335089-493b-42bb-9a5f-cb4ea39951f4-kube-api-access-vsfhm\") pod \"dnsmasq-dns-5f59b8f679-6ms77\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.452482 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.529855 4606 generic.go:334] "Generic (PLEG): container finished" podID="be20f016-2639-4715-80af-5719a068a857" containerID="54016490d6a36738eecae0c922fe64bcf2044ea1661f592cea65e0ef980ce0ae" exitCode=0 Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.529926 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-019c-account-create-update-h2brj" event={"ID":"be20f016-2639-4715-80af-5719a068a857","Type":"ContainerDied","Data":"54016490d6a36738eecae0c922fe64bcf2044ea1661f592cea65e0ef980ce0ae"} Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.552203 4606 generic.go:334] "Generic (PLEG): container finished" podID="5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa" containerID="28dd3737c76d219cc9b1806a838aaa6e389ed9dc5614f000677c18465b7744ec" exitCode=0 Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.552297 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nrs7j" event={"ID":"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa","Type":"ContainerDied","Data":"28dd3737c76d219cc9b1806a838aaa6e389ed9dc5614f000677c18465b7744ec"} Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.564325 4606 generic.go:334] "Generic (PLEG): container finished" podID="18577071-2d04-4c75-99fd-cd721836a571" containerID="f02cd2fb8847afdb298b2342547cde2b3f66ced980c44d729cd2003e20832d41" exitCode=0 Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.564458 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d1d-account-create-update-r2rq9" event={"ID":"18577071-2d04-4c75-99fd-cd721836a571","Type":"ContainerDied","Data":"f02cd2fb8847afdb298b2342547cde2b3f66ced980c44d729cd2003e20832d41"} Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 
00:45:11.572714 4606 generic.go:334] "Generic (PLEG): container finished" podID="0ea658e5-7318-49c3-ab43-00c885902e4f" containerID="7590f4e5bf7c08d9e7683102e025e145ecbaa39ebc5127fba13a965ecebb7d1c" exitCode=0 Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.572906 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4rtsn" event={"ID":"0ea658e5-7318-49c3-ab43-00c885902e4f","Type":"ContainerDied","Data":"7590f4e5bf7c08d9e7683102e025e145ecbaa39ebc5127fba13a965ecebb7d1c"} Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.579732 4606 generic.go:334] "Generic (PLEG): container finished" podID="b00e1c7b-96a5-46de-8f2e-66ed8ff9275c" containerID="2960132e59f66efa0c957f351e2825dee6705cc8549fd74234110e8fe867a2bf" exitCode=0 Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.579850 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8mpd7" event={"ID":"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c","Type":"ContainerDied","Data":"2960132e59f66efa0c957f351e2825dee6705cc8549fd74234110e8fe867a2bf"} Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.581869 4606 generic.go:334] "Generic (PLEG): container finished" podID="92f2dac5-52f1-4438-aa85-059425ed7822" containerID="e76773484484c312e8bf03d6e9ac05b4491b9cd0a33d4352cfc58f910464c2e5" exitCode=0 Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.582002 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5080-account-create-update-ztlj6" event={"ID":"92f2dac5-52f1-4438-aa85-059425ed7822","Type":"ContainerDied","Data":"e76773484484c312e8bf03d6e9ac05b4491b9cd0a33d4352cfc58f910464c2e5"} Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.822358 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ms77"] Dec 12 00:45:11 crc kubenswrapper[4606]: I1212 00:45:11.957687 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-9rbg7"] Dec 12 
00:45:11 crc kubenswrapper[4606]: W1212 00:45:11.965974 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5017a1a6_ef1c_4277_9282_9bf58af30ae7.slice/crio-5956968746bcb4d7165de8640225bd81e5e1198d33069b2941c9d3dd02adfbc2 WatchSource:0}: Error finding container 5956968746bcb4d7165de8640225bd81e5e1198d33069b2941c9d3dd02adfbc2: Status 404 returned error can't find the container with id 5956968746bcb4d7165de8640225bd81e5e1198d33069b2941c9d3dd02adfbc2 Dec 12 00:45:12 crc kubenswrapper[4606]: I1212 00:45:12.594100 4606 generic.go:334] "Generic (PLEG): container finished" podID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerID="00daec83dafa2cf69b8acb19b94061424145cdfeacf788446937d11ece95dcaa" exitCode=0 Dec 12 00:45:12 crc kubenswrapper[4606]: I1212 00:45:12.594203 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" event={"ID":"31335089-493b-42bb-9a5f-cb4ea39951f4","Type":"ContainerDied","Data":"00daec83dafa2cf69b8acb19b94061424145cdfeacf788446937d11ece95dcaa"} Dec 12 00:45:12 crc kubenswrapper[4606]: I1212 00:45:12.594236 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" event={"ID":"31335089-493b-42bb-9a5f-cb4ea39951f4","Type":"ContainerStarted","Data":"fb90acd762d492feb0afe414e82b66b34cee5ac9bd7f1633a2105839c7ab13cf"} Dec 12 00:45:12 crc kubenswrapper[4606]: I1212 00:45:12.599089 4606 generic.go:334] "Generic (PLEG): container finished" podID="5017a1a6-ef1c-4277-9282-9bf58af30ae7" containerID="b39e9b4464e2bf1800ce2d3a8bf8ffac2bf9bea032bcacdd1d5b472198b806d2" exitCode=0 Dec 12 00:45:12 crc kubenswrapper[4606]: I1212 00:45:12.599449 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" event={"ID":"5017a1a6-ef1c-4277-9282-9bf58af30ae7","Type":"ContainerDied","Data":"b39e9b4464e2bf1800ce2d3a8bf8ffac2bf9bea032bcacdd1d5b472198b806d2"} Dec 12 00:45:12 
crc kubenswrapper[4606]: I1212 00:45:12.599475 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" event={"ID":"5017a1a6-ef1c-4277-9282-9bf58af30ae7","Type":"ContainerStarted","Data":"5956968746bcb4d7165de8640225bd81e5e1198d33069b2941c9d3dd02adfbc2"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.007909 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.105859 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjpjt\" (UniqueName: \"kubernetes.io/projected/0ea658e5-7318-49c3-ab43-00c885902e4f-kube-api-access-fjpjt\") pod \"0ea658e5-7318-49c3-ab43-00c885902e4f\" (UID: \"0ea658e5-7318-49c3-ab43-00c885902e4f\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.106020 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea658e5-7318-49c3-ab43-00c885902e4f-operator-scripts\") pod \"0ea658e5-7318-49c3-ab43-00c885902e4f\" (UID: \"0ea658e5-7318-49c3-ab43-00c885902e4f\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.109300 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea658e5-7318-49c3-ab43-00c885902e4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ea658e5-7318-49c3-ab43-00c885902e4f" (UID: "0ea658e5-7318-49c3-ab43-00c885902e4f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.114395 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea658e5-7318-49c3-ab43-00c885902e4f-kube-api-access-fjpjt" (OuterVolumeSpecName: "kube-api-access-fjpjt") pod "0ea658e5-7318-49c3-ab43-00c885902e4f" (UID: "0ea658e5-7318-49c3-ab43-00c885902e4f"). InnerVolumeSpecName "kube-api-access-fjpjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.210945 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjpjt\" (UniqueName: \"kubernetes.io/projected/0ea658e5-7318-49c3-ab43-00c885902e4f-kube-api-access-fjpjt\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.211386 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea658e5-7318-49c3-ab43-00c885902e4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.327854 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.344362 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.344972 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.367605 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.415615 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92f2dac5-52f1-4438-aa85-059425ed7822-operator-scripts\") pod \"92f2dac5-52f1-4438-aa85-059425ed7822\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.415844 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-config\") pod \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416088 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-sb\") pod \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416263 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67ch9\" (UniqueName: \"kubernetes.io/projected/5017a1a6-ef1c-4277-9282-9bf58af30ae7-kube-api-access-67ch9\") pod \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416362 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6kdf\" (UniqueName: \"kubernetes.io/projected/92f2dac5-52f1-4438-aa85-059425ed7822-kube-api-access-q6kdf\") pod \"92f2dac5-52f1-4438-aa85-059425ed7822\" (UID: \"92f2dac5-52f1-4438-aa85-059425ed7822\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416423 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-dns-svc\") pod \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416530 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb\") pod \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416625 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5v2r\" (UniqueName: \"kubernetes.io/projected/18577071-2d04-4c75-99fd-cd721836a571-kube-api-access-j5v2r\") pod \"18577071-2d04-4c75-99fd-cd721836a571\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416713 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18577071-2d04-4c75-99fd-cd721836a571-operator-scripts\") pod \"18577071-2d04-4c75-99fd-cd721836a571\" (UID: \"18577071-2d04-4c75-99fd-cd721836a571\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.416106 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f2dac5-52f1-4438-aa85-059425ed7822-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92f2dac5-52f1-4438-aa85-059425ed7822" (UID: "92f2dac5-52f1-4438-aa85-059425ed7822"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.417215 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92f2dac5-52f1-4438-aa85-059425ed7822-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.420995 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18577071-2d04-4c75-99fd-cd721836a571-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18577071-2d04-4c75-99fd-cd721836a571" (UID: "18577071-2d04-4c75-99fd-cd721836a571"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.432529 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f2dac5-52f1-4438-aa85-059425ed7822-kube-api-access-q6kdf" (OuterVolumeSpecName: "kube-api-access-q6kdf") pod "92f2dac5-52f1-4438-aa85-059425ed7822" (UID: "92f2dac5-52f1-4438-aa85-059425ed7822"). InnerVolumeSpecName "kube-api-access-q6kdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.446585 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18577071-2d04-4c75-99fd-cd721836a571-kube-api-access-j5v2r" (OuterVolumeSpecName: "kube-api-access-j5v2r") pod "18577071-2d04-4c75-99fd-cd721836a571" (UID: "18577071-2d04-4c75-99fd-cd721836a571"). InnerVolumeSpecName "kube-api-access-j5v2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.447095 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5017a1a6-ef1c-4277-9282-9bf58af30ae7-kube-api-access-67ch9" (OuterVolumeSpecName: "kube-api-access-67ch9") pod "5017a1a6-ef1c-4277-9282-9bf58af30ae7" (UID: "5017a1a6-ef1c-4277-9282-9bf58af30ae7"). InnerVolumeSpecName "kube-api-access-67ch9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.476935 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5017a1a6-ef1c-4277-9282-9bf58af30ae7" (UID: "5017a1a6-ef1c-4277-9282-9bf58af30ae7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.479742 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-config" (OuterVolumeSpecName: "config") pod "5017a1a6-ef1c-4277-9282-9bf58af30ae7" (UID: "5017a1a6-ef1c-4277-9282-9bf58af30ae7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.506669 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5017a1a6-ef1c-4277-9282-9bf58af30ae7" (UID: "5017a1a6-ef1c-4277-9282-9bf58af30ae7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: E1212 00:45:13.507050 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb podName:5017a1a6-ef1c-4277-9282-9bf58af30ae7 nodeName:}" failed. No retries permitted until 2025-12-12 00:45:14.007029394 +0000 UTC m=+1304.552382260 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb") pod "5017a1a6-ef1c-4277-9282-9bf58af30ae7" (UID: "5017a1a6-ef1c-4277-9282-9bf58af30ae7") : error deleting /var/lib/kubelet/pods/5017a1a6-ef1c-4277-9282-9bf58af30ae7/volume-subpaths: remove /var/lib/kubelet/pods/5017a1a6-ef1c-4277-9282-9bf58af30ae7/volume-subpaths: no such file or directory Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.516945 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.522588 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.536953 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-operator-scripts\") pod \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.537770 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-kube-api-access-t442m\") pod \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\" (UID: \"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.538105 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18577071-2d04-4c75-99fd-cd721836a571-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.538122 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.538131 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.538141 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67ch9\" (UniqueName: \"kubernetes.io/projected/5017a1a6-ef1c-4277-9282-9bf58af30ae7-kube-api-access-67ch9\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.538149 4606 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-q6kdf\" (UniqueName: \"kubernetes.io/projected/92f2dac5-52f1-4438-aa85-059425ed7822-kube-api-access-q6kdf\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.538157 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.538164 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5v2r\" (UniqueName: \"kubernetes.io/projected/18577071-2d04-4c75-99fd-cd721836a571-kube-api-access-j5v2r\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.541146 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-kube-api-access-t442m" (OuterVolumeSpecName: "kube-api-access-t442m") pod "b00e1c7b-96a5-46de-8f2e-66ed8ff9275c" (UID: "b00e1c7b-96a5-46de-8f2e-66ed8ff9275c"). InnerVolumeSpecName "kube-api-access-t442m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.541712 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b00e1c7b-96a5-46de-8f2e-66ed8ff9275c" (UID: "b00e1c7b-96a5-46de-8f2e-66ed8ff9275c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.625195 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" event={"ID":"5017a1a6-ef1c-4277-9282-9bf58af30ae7","Type":"ContainerDied","Data":"5956968746bcb4d7165de8640225bd81e5e1198d33069b2941c9d3dd02adfbc2"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.625245 4606 scope.go:117] "RemoveContainer" containerID="b39e9b4464e2bf1800ce2d3a8bf8ffac2bf9bea032bcacdd1d5b472198b806d2" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.625344 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-9rbg7" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.635017 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d1d-account-create-update-r2rq9" event={"ID":"18577071-2d04-4c75-99fd-cd721836a571","Type":"ContainerDied","Data":"765362ecf5875898861ae1419e00ce1b3a54c4be5093289c2b6469508b050feb"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.635049 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765362ecf5875898861ae1419e00ce1b3a54c4be5093289c2b6469508b050feb" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.635030 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d1d-account-create-update-r2rq9" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.638509 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be20f016-2639-4715-80af-5719a068a857-operator-scripts\") pod \"be20f016-2639-4715-80af-5719a068a857\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.638693 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjn6b\" (UniqueName: \"kubernetes.io/projected/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-kube-api-access-pjn6b\") pod \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.638821 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxxj5\" (UniqueName: \"kubernetes.io/projected/be20f016-2639-4715-80af-5719a068a857-kube-api-access-rxxj5\") pod \"be20f016-2639-4715-80af-5719a068a857\" (UID: \"be20f016-2639-4715-80af-5719a068a857\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.638859 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-operator-scripts\") pod \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\" (UID: \"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa\") " Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.639109 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be20f016-2639-4715-80af-5719a068a857-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be20f016-2639-4715-80af-5719a068a857" (UID: "be20f016-2639-4715-80af-5719a068a857"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.639285 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t442m\" (UniqueName: \"kubernetes.io/projected/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-kube-api-access-t442m\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.639310 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.639323 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be20f016-2639-4715-80af-5719a068a857-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.639582 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa" (UID: "5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.641869 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4rtsn" event={"ID":"0ea658e5-7318-49c3-ab43-00c885902e4f","Type":"ContainerDied","Data":"38b2f19e1c99058db1edbdd0abf4e3a5f4ecd02d6a4e19811442534942b1499d"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.641901 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38b2f19e1c99058db1edbdd0abf4e3a5f4ecd02d6a4e19811442534942b1499d" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.641974 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4rtsn" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.642317 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be20f016-2639-4715-80af-5719a068a857-kube-api-access-rxxj5" (OuterVolumeSpecName: "kube-api-access-rxxj5") pod "be20f016-2639-4715-80af-5719a068a857" (UID: "be20f016-2639-4715-80af-5719a068a857"). InnerVolumeSpecName "kube-api-access-rxxj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.643026 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-kube-api-access-pjn6b" (OuterVolumeSpecName: "kube-api-access-pjn6b") pod "5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa" (UID: "5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa"). InnerVolumeSpecName "kube-api-access-pjn6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.643315 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8mpd7" event={"ID":"b00e1c7b-96a5-46de-8f2e-66ed8ff9275c","Type":"ContainerDied","Data":"5cee1959311ff9754ac3a4c3bd618855096fd46052eccf453f7074a8403a8fa0"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.643342 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cee1959311ff9754ac3a4c3bd618855096fd46052eccf453f7074a8403a8fa0" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.643410 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8mpd7" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.645395 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5080-account-create-update-ztlj6" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.645431 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5080-account-create-update-ztlj6" event={"ID":"92f2dac5-52f1-4438-aa85-059425ed7822","Type":"ContainerDied","Data":"eb252b5ca0081c62840a7096bc15f2aa9bab374502d16fe5eeade13595384b0f"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.645467 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb252b5ca0081c62840a7096bc15f2aa9bab374502d16fe5eeade13595384b0f" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.647706 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-019c-account-create-update-h2brj" event={"ID":"be20f016-2639-4715-80af-5719a068a857","Type":"ContainerDied","Data":"674a4c415bd9dafe9e0eba94768168650ce80ad885678af11a9939e107bef37e"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.647729 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674a4c415bd9dafe9e0eba94768168650ce80ad885678af11a9939e107bef37e" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.647791 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-019c-account-create-update-h2brj" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.651680 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nrs7j" event={"ID":"5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa","Type":"ContainerDied","Data":"39b6a4bc0fce184e32bef85c40612c3f29be4c1f83517b7a82af50470e7ff488"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.651708 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39b6a4bc0fce184e32bef85c40612c3f29be4c1f83517b7a82af50470e7ff488" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.651808 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nrs7j" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.657495 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" event={"ID":"31335089-493b-42bb-9a5f-cb4ea39951f4","Type":"ContainerStarted","Data":"8491d7f65e8fd7e0f7e430fda22ef9861a076be3942ad2ab7e3118166e6f9466"} Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.657860 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.687030 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" podStartSLOduration=3.687011714 podStartE2EDuration="3.687011714s" podCreationTimestamp="2025-12-12 00:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:13.677575061 +0000 UTC m=+1304.222927937" watchObservedRunningTime="2025-12-12 00:45:13.687011714 +0000 UTC m=+1304.232364580" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.741148 4606 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pjn6b\" (UniqueName: \"kubernetes.io/projected/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-kube-api-access-pjn6b\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.741188 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxxj5\" (UniqueName: \"kubernetes.io/projected/be20f016-2639-4715-80af-5719a068a857-kube-api-access-rxxj5\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:13 crc kubenswrapper[4606]: I1212 00:45:13.741199 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:14 crc kubenswrapper[4606]: I1212 00:45:14.045221 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb\") pod \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\" (UID: \"5017a1a6-ef1c-4277-9282-9bf58af30ae7\") " Dec 12 00:45:14 crc kubenswrapper[4606]: I1212 00:45:14.045885 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5017a1a6-ef1c-4277-9282-9bf58af30ae7" (UID: "5017a1a6-ef1c-4277-9282-9bf58af30ae7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:14 crc kubenswrapper[4606]: I1212 00:45:14.146733 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5017a1a6-ef1c-4277-9282-9bf58af30ae7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:14 crc kubenswrapper[4606]: I1212 00:45:14.285418 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-9rbg7"] Dec 12 00:45:14 crc kubenswrapper[4606]: I1212 00:45:14.297691 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-9rbg7"] Dec 12 00:45:15 crc kubenswrapper[4606]: I1212 00:45:15.710889 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5017a1a6-ef1c-4277-9282-9bf58af30ae7" path="/var/lib/kubelet/pods/5017a1a6-ef1c-4277-9282-9bf58af30ae7/volumes" Dec 12 00:45:17 crc kubenswrapper[4606]: I1212 00:45:17.712536 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j24jk" event={"ID":"a58ecb89-37d3-4d35-9a1a-9820df05848e","Type":"ContainerStarted","Data":"9313441d4ff4e009604a3cd040bfbe6d68e5a117c52c8de43661a35abe25c295"} Dec 12 00:45:17 crc kubenswrapper[4606]: I1212 00:45:17.725706 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-j24jk" podStartSLOduration=3.093497332 podStartE2EDuration="9.7256893s" podCreationTimestamp="2025-12-12 00:45:08 +0000 UTC" firstStartedPulling="2025-12-12 00:45:10.163711699 +0000 UTC m=+1300.709064565" lastFinishedPulling="2025-12-12 00:45:16.795903667 +0000 UTC m=+1307.341256533" observedRunningTime="2025-12-12 00:45:17.723439209 +0000 UTC m=+1308.268792085" watchObservedRunningTime="2025-12-12 00:45:17.7256893 +0000 UTC m=+1308.271042166" Dec 12 00:45:17 crc kubenswrapper[4606]: I1212 00:45:17.748850 4606 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","podd4a54eac-00ee-452a-9c4b-e777e338e670"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd4a54eac-00ee-452a-9c4b-e777e338e670] : Timed out while waiting for systemd to remove kubepods-besteffort-podd4a54eac_00ee_452a_9c4b_e777e338e670.slice" Dec 12 00:45:19 crc kubenswrapper[4606]: I1212 00:45:19.726116 4606 generic.go:334] "Generic (PLEG): container finished" podID="a58ecb89-37d3-4d35-9a1a-9820df05848e" containerID="9313441d4ff4e009604a3cd040bfbe6d68e5a117c52c8de43661a35abe25c295" exitCode=0 Dec 12 00:45:19 crc kubenswrapper[4606]: I1212 00:45:19.726414 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j24jk" event={"ID":"a58ecb89-37d3-4d35-9a1a-9820df05848e","Type":"ContainerDied","Data":"9313441d4ff4e009604a3cd040bfbe6d68e5a117c52c8de43661a35abe25c295"} Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.039622 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.179953 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-config-data\") pod \"a58ecb89-37d3-4d35-9a1a-9820df05848e\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.180273 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-combined-ca-bundle\") pod \"a58ecb89-37d3-4d35-9a1a-9820df05848e\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.180723 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbz25\" (UniqueName: 
\"kubernetes.io/projected/a58ecb89-37d3-4d35-9a1a-9820df05848e-kube-api-access-xbz25\") pod \"a58ecb89-37d3-4d35-9a1a-9820df05848e\" (UID: \"a58ecb89-37d3-4d35-9a1a-9820df05848e\") " Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.200482 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58ecb89-37d3-4d35-9a1a-9820df05848e-kube-api-access-xbz25" (OuterVolumeSpecName: "kube-api-access-xbz25") pod "a58ecb89-37d3-4d35-9a1a-9820df05848e" (UID: "a58ecb89-37d3-4d35-9a1a-9820df05848e"). InnerVolumeSpecName "kube-api-access-xbz25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.209323 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a58ecb89-37d3-4d35-9a1a-9820df05848e" (UID: "a58ecb89-37d3-4d35-9a1a-9820df05848e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.245469 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-config-data" (OuterVolumeSpecName: "config-data") pod "a58ecb89-37d3-4d35-9a1a-9820df05848e" (UID: "a58ecb89-37d3-4d35-9a1a-9820df05848e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.282680 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.282717 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbz25\" (UniqueName: \"kubernetes.io/projected/a58ecb89-37d3-4d35-9a1a-9820df05848e-kube-api-access-xbz25\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.282731 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58ecb89-37d3-4d35-9a1a-9820df05848e-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.454495 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.543350 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v224c"] Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.543696 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" podUID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerName="dnsmasq-dns" containerID="cri-o://bb4b0e071f6d1520c484ab216a2d53fd85e9cf5e48c7849c5f250a898e52f497" gracePeriod=10 Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.767794 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j24jk" event={"ID":"a58ecb89-37d3-4d35-9a1a-9820df05848e","Type":"ContainerDied","Data":"a89b6129fedb42ded877a2fc205b4e982a69d5020aa53fc35a5828c112134a7d"} Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.767839 4606 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="a89b6129fedb42ded877a2fc205b4e982a69d5020aa53fc35a5828c112134a7d" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.767894 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j24jk" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.797631 4606 generic.go:334] "Generic (PLEG): container finished" podID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerID="bb4b0e071f6d1520c484ab216a2d53fd85e9cf5e48c7849c5f250a898e52f497" exitCode=0 Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.797675 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" event={"ID":"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d","Type":"ContainerDied","Data":"bb4b0e071f6d1520c484ab216a2d53fd85e9cf5e48c7849c5f250a898e52f497"} Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919539 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-98jlt"] Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919858 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00e1c7b-96a5-46de-8f2e-66ed8ff9275c" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919870 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00e1c7b-96a5-46de-8f2e-66ed8ff9275c" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919886 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58ecb89-37d3-4d35-9a1a-9820df05848e" containerName="keystone-db-sync" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919892 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58ecb89-37d3-4d35-9a1a-9820df05848e" containerName="keystone-db-sync" Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919905 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5017a1a6-ef1c-4277-9282-9bf58af30ae7" 
containerName="init" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919911 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5017a1a6-ef1c-4277-9282-9bf58af30ae7" containerName="init" Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919928 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18577071-2d04-4c75-99fd-cd721836a571" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919934 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="18577071-2d04-4c75-99fd-cd721836a571" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919942 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919947 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919961 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f2dac5-52f1-4438-aa85-059425ed7822" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919967 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f2dac5-52f1-4438-aa85-059425ed7822" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919976 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea658e5-7318-49c3-ab43-00c885902e4f" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919981 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea658e5-7318-49c3-ab43-00c885902e4f" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: E1212 00:45:21.919988 4606 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="be20f016-2639-4715-80af-5719a068a857" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.919994 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="be20f016-2639-4715-80af-5719a068a857" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920140 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="be20f016-2639-4715-80af-5719a068a857" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920151 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58ecb89-37d3-4d35-9a1a-9820df05848e" containerName="keystone-db-sync" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920158 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea658e5-7318-49c3-ab43-00c885902e4f" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920169 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920182 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00e1c7b-96a5-46de-8f2e-66ed8ff9275c" containerName="mariadb-database-create" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920215 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f2dac5-52f1-4438-aa85-059425ed7822" containerName="mariadb-account-create-update" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920228 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5017a1a6-ef1c-4277-9282-9bf58af30ae7" containerName="init" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.920243 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="18577071-2d04-4c75-99fd-cd721836a571" containerName="mariadb-account-create-update" Dec 12 
00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.921053 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.939307 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-98jlt"] Dec 12 00:45:21 crc kubenswrapper[4606]: I1212 00:45:21.998998 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jx54m"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.000336 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.023947 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.024075 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.024194 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.024368 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wkdqx" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.024569 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.025978 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jx54m"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102085 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-combined-ca-bundle\") pod \"keystone-bootstrap-jx54m\" (UID: 
\"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102179 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-fernet-keys\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102228 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-scripts\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102265 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102290 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcv9\" (UniqueName: \"kubernetes.io/projected/aabbe14d-15a9-4e93-a862-00fd5cf988a0-kube-api-access-pdcv9\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102312 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-config\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: 
\"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102341 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102369 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjsz\" (UniqueName: \"kubernetes.io/projected/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-kube-api-access-ffjsz\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102412 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102437 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-config-data\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102461 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-svc\") pod 
\"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.102485 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-credential-keys\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.207691 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-combined-ca-bundle\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.207956 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-fernet-keys\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.207992 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-scripts\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208017 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208035 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcv9\" (UniqueName: \"kubernetes.io/projected/aabbe14d-15a9-4e93-a862-00fd5cf988a0-kube-api-access-pdcv9\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208054 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-config\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208074 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208094 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjsz\" (UniqueName: \"kubernetes.io/projected/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-kube-api-access-ffjsz\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208126 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 
12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208145 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-config-data\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208164 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.208185 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-credential-keys\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.209030 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-config\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.209736 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.213818 4606 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.213938 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.214038 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.215314 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-combined-ca-bundle\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.229726 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-scripts\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.230027 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-credential-keys\") pod 
\"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.230811 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-fernet-keys\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.256109 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-config-data\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.271530 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjsz\" (UniqueName: \"kubernetes.io/projected/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-kube-api-access-ffjsz\") pod \"keystone-bootstrap-jx54m\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.287033 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f59cc7d7-l4jln"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.288353 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.299691 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.299874 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.300020 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.312477 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-nclhm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.319957 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcv9\" (UniqueName: \"kubernetes.io/projected/aabbe14d-15a9-4e93-a862-00fd5cf988a0-kube-api-access-pdcv9\") pod \"dnsmasq-dns-bbf5cc879-98jlt\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.329865 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f59cc7d7-l4jln"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.342400 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.417362 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b710b7ca-abc9-465b-a279-949bd345962b-horizon-secret-key\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.417487 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-scripts\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.417515 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b710b7ca-abc9-465b-a279-949bd345962b-logs\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.417601 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-config-data\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.417927 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq47m\" (UniqueName: \"kubernetes.io/projected/b710b7ca-abc9-465b-a279-949bd345962b-kube-api-access-mq47m\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " 
pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.446316 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.502315 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9xbj8"] Dec 12 00:45:22 crc kubenswrapper[4606]: E1212 00:45:22.521637 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerName="dnsmasq-dns" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.521663 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerName="dnsmasq-dns" Dec 12 00:45:22 crc kubenswrapper[4606]: E1212 00:45:22.521705 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerName="init" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.521724 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerName="init" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.521933 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" containerName="dnsmasq-dns" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.522887 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.523873 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-config-data\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.523958 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq47m\" (UniqueName: \"kubernetes.io/projected/b710b7ca-abc9-465b-a279-949bd345962b-kube-api-access-mq47m\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.524074 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b710b7ca-abc9-465b-a279-949bd345962b-horizon-secret-key\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.524106 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-scripts\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.524146 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b710b7ca-abc9-465b-a279-949bd345962b-logs\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.524774 
4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b710b7ca-abc9-465b-a279-949bd345962b-logs\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.526051 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-config-data\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.536412 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9xbj8"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.537102 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-scripts\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.550680 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b710b7ca-abc9-465b-a279-949bd345962b-horizon-secret-key\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.550939 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.551212 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m4kqj" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.551500 4606 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-config-data" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.558379 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.624882 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq47m\" (UniqueName: \"kubernetes.io/projected/b710b7ca-abc9-465b-a279-949bd345962b-kube-api-access-mq47m\") pod \"horizon-5f59cc7d7-l4jln\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625384 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sczjk\" (UniqueName: \"kubernetes.io/projected/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-kube-api-access-sczjk\") pod \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625459 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-config\") pod \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625607 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-dns-svc\") pod \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625639 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-nb\") pod \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\" (UID: 
\"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625661 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-sb\") pod \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\" (UID: \"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d\") " Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625865 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-config-data\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625889 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-db-sync-config-data\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625912 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfdl\" (UniqueName: \"kubernetes.io/projected/7978c0cd-b859-49f1-ad0e-1cb88ff58495-kube-api-access-jwfdl\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.625941 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-scripts\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc 
kubenswrapper[4606]: I1212 00:45:22.625975 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-combined-ca-bundle\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.626003 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7978c0cd-b859-49f1-ad0e-1cb88ff58495-etc-machine-id\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.680681 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64d56f87f-bhmtm"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.682572 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-kube-api-access-sczjk" (OuterVolumeSpecName: "kube-api-access-sczjk") pod "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" (UID: "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d"). InnerVolumeSpecName "kube-api-access-sczjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.684884 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.732892 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-scripts\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.733136 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-combined-ca-bundle\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.733177 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7978c0cd-b859-49f1-ad0e-1cb88ff58495-etc-machine-id\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.733273 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-config-data\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.733291 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-db-sync-config-data\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.733313 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfdl\" (UniqueName: \"kubernetes.io/projected/7978c0cd-b859-49f1-ad0e-1cb88ff58495-kube-api-access-jwfdl\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.733370 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sczjk\" (UniqueName: \"kubernetes.io/projected/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-kube-api-access-sczjk\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.739310 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7978c0cd-b859-49f1-ad0e-1cb88ff58495-etc-machine-id\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.757264 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-98jlt"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.765330 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.771506 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-config-data\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.783706 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-combined-ca-bundle\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.793383 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-scripts\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.796636 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-db-sync-config-data\") pod \"cinder-db-sync-9xbj8\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.808256 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64d56f87f-bhmtm"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.820039 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfdl\" (UniqueName: \"kubernetes.io/projected/7978c0cd-b859-49f1-ad0e-1cb88ff58495-kube-api-access-jwfdl\") pod \"cinder-db-sync-9xbj8\" (UID: 
\"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.828452 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hp9lg"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.838725 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdnbm\" (UniqueName: \"kubernetes.io/projected/fc444db7-f445-40e7-bea0-f16f6afc2b91-kube-api-access-xdnbm\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.838765 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-scripts\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.838827 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-config-data\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.838884 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc444db7-f445-40e7-bea0-f16f6afc2b91-logs\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.838915 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/fc444db7-f445-40e7-bea0-f16f6afc2b91-horizon-secret-key\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.858891 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.859900 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-config" (OuterVolumeSpecName: "config") pod "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" (UID: "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.864480 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.864687 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.873596 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" event={"ID":"d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d","Type":"ContainerDied","Data":"cbab86f029bb8028e8c380aaee0515f5ed8f624f81a5f5f244e2b95c6ce8c203"} Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.888687 4606 scope.go:117] "RemoveContainer" containerID="bb4b0e071f6d1520c484ab216a2d53fd85e9cf5e48c7849c5f250a898e52f497" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.873689 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v224c" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.883886 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.875746 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xhqlh" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.890648 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" (UID: "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.900039 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4hg5f"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.901460 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.906606 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.906780 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lvj2f" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.906880 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.920608 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" (UID: "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948062 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc444db7-f445-40e7-bea0-f16f6afc2b91-horizon-secret-key\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948136 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-combined-ca-bundle\") pod \"neutron-db-sync-hp9lg\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948215 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdnbm\" (UniqueName: \"kubernetes.io/projected/fc444db7-f445-40e7-bea0-f16f6afc2b91-kube-api-access-xdnbm\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948238 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88cn\" (UniqueName: \"kubernetes.io/projected/ce0c3f48-d61e-420d-ab53-61361c7a4a25-kube-api-access-w88cn\") pod \"neutron-db-sync-hp9lg\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948279 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-scripts\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " 
pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948332 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-config\") pod \"neutron-db-sync-hp9lg\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948368 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-config-data\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948411 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc444db7-f445-40e7-bea0-f16f6afc2b91-logs\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948467 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948477 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948487 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.948969 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc444db7-f445-40e7-bea0-f16f6afc2b91-logs\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.949541 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-scripts\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.949641 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-j2fv2"] Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.951020 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.965738 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-config-data\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:22 crc kubenswrapper[4606]: I1212 00:45:22.988286 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdnbm\" (UniqueName: \"kubernetes.io/projected/fc444db7-f445-40e7-bea0-f16f6afc2b91-kube-api-access-xdnbm\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.010773 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hp9lg"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.019722 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc444db7-f445-40e7-bea0-f16f6afc2b91-horizon-secret-key\") pod \"horizon-64d56f87f-bhmtm\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.029210 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" (UID: "d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.047621 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049595 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049635 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-config\") pod \"neutron-db-sync-hp9lg\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049657 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: 
\"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049695 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-combined-ca-bundle\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049717 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0b6b98-c743-4435-a967-55c0edb95531-logs\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049742 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/5a0b6b98-c743-4435-a967-55c0edb95531-kube-api-access-v2cll\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049764 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049797 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-combined-ca-bundle\") pod \"neutron-db-sync-hp9lg\" (UID: 
\"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049813 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dz6w\" (UniqueName: \"kubernetes.io/projected/39112d38-7887-4e2b-b32f-d679ca162941-kube-api-access-9dz6w\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049836 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-scripts\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049856 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049882 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-config\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049909 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88cn\" (UniqueName: \"kubernetes.io/projected/ce0c3f48-d61e-420d-ab53-61361c7a4a25-kube-api-access-w88cn\") pod \"neutron-db-sync-hp9lg\" (UID: 
\"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049929 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-config-data\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.049975 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.065976 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-combined-ca-bundle\") pod \"neutron-db-sync-hp9lg\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.074828 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-config\") pod \"neutron-db-sync-hp9lg\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.107484 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88cn\" (UniqueName: \"kubernetes.io/projected/ce0c3f48-d61e-420d-ab53-61361c7a4a25-kube-api-access-w88cn\") pod \"neutron-db-sync-hp9lg\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.114603 4606 scope.go:117] "RemoveContainer" 
containerID="b389a3beb415877ef288eec5fb32660337c20df2a722a5732809ce332fcb288d" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.116164 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4hg5f"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151030 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/5a0b6b98-c743-4435-a967-55c0edb95531-kube-api-access-v2cll\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151085 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151130 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dz6w\" (UniqueName: \"kubernetes.io/projected/39112d38-7887-4e2b-b32f-d679ca162941-kube-api-access-9dz6w\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151153 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-scripts\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151183 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151224 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-config\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151259 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-config-data\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151281 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151313 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151348 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-combined-ca-bundle\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151367 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0b6b98-c743-4435-a967-55c0edb95531-logs\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.151779 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0b6b98-c743-4435-a967-55c0edb95531-logs\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.152857 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.162186 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.162791 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: 
\"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.163027 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-config\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.163613 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.182066 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-config-data\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.186272 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-scripts\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.198773 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.204879 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-combined-ca-bundle\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.209487 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/5a0b6b98-c743-4435-a967-55c0edb95531-kube-api-access-v2cll\") pod \"placement-db-sync-4hg5f\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.209630 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dz6w\" (UniqueName: \"kubernetes.io/projected/39112d38-7887-4e2b-b32f-d679ca162941-kube-api-access-9dz6w\") pod \"dnsmasq-dns-56df8fb6b7-j2fv2\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.212652 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-j2fv2"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.223013 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-qmcpn"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.224407 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.233675 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8t92h" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.233902 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.250532 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4hg5f" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.252700 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qmcpn"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.354562 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-db-sync-config-data\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.354776 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm82\" (UniqueName: \"kubernetes.io/projected/73c136c9-e12d-434a-aab1-ed21dfaf0f60-kube-api-access-7mm82\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.354814 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-combined-ca-bundle\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: 
I1212 00:45:23.379975 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.382988 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.393060 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.393261 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.443446 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.445111 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.452657 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.452921 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gs7ln" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.453034 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.453159 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456282 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " 
pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456325 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm82\" (UniqueName: \"kubernetes.io/projected/73c136c9-e12d-434a-aab1-ed21dfaf0f60-kube-api-access-7mm82\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456374 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-combined-ca-bundle\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456399 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-config-data\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456428 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqrqm\" (UniqueName: \"kubernetes.io/projected/88bd7935-19a0-486d-b1e7-4737abcf21ab-kube-api-access-jqrqm\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456465 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-run-httpd\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456502 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-log-httpd\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456542 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-scripts\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456573 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-db-sync-config-data\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.456591 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: W1212 00:45:23.480789 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b21b129_dddc_4a41_ad1f_6cad37d0aa07.slice/crio-5cf49a5d879c03def251f0378329479a6e1edafa5d925c0793be69d8d9c49a3c WatchSource:0}: Error finding container 5cf49a5d879c03def251f0378329479a6e1edafa5d925c0793be69d8d9c49a3c: Status 404 returned error can't find the container with id 5cf49a5d879c03def251f0378329479a6e1edafa5d925c0793be69d8d9c49a3c Dec 12 00:45:23 crc 
kubenswrapper[4606]: I1212 00:45:23.495776 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-combined-ca-bundle\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.505340 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.523278 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.530430 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-db-sync-config-data\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.530731 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm82\" (UniqueName: \"kubernetes.io/projected/73c136c9-e12d-434a-aab1-ed21dfaf0f60-kube-api-access-7mm82\") pod \"barbican-db-sync-qmcpn\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.541763 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590226 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-scripts\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590309 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6w52\" (UniqueName: \"kubernetes.io/projected/f83e9b6e-2200-4066-81be-df7d867ea60e-kube-api-access-p6w52\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590344 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590381 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590505 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 
00:45:23.590550 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590578 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590640 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590696 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590803 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-config-data\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590861 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqrqm\" (UniqueName: \"kubernetes.io/projected/88bd7935-19a0-486d-b1e7-4737abcf21ab-kube-api-access-jqrqm\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590910 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-run-httpd\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.590973 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.591071 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.591095 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-log-httpd\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.600679 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-run-httpd\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.601290 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-log-httpd\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.605681 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.614269 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-scripts\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.615346 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-config-data\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.620283 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jx54m"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.655161 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqrqm\" (UniqueName: \"kubernetes.io/projected/88bd7935-19a0-486d-b1e7-4737abcf21ab-kube-api-access-jqrqm\") pod \"ceilometer-0\" (UID: 
\"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.666680 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") " pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.698582 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.698656 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.698745 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.698773 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.698854 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6w52\" (UniqueName: \"kubernetes.io/projected/f83e9b6e-2200-4066-81be-df7d867ea60e-kube-api-access-p6w52\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.698883 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.718430 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.756465 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.795186 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.808968 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.809155 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.809202 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.810957 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.832528 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.838474 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6w52\" (UniqueName: \"kubernetes.io/projected/f83e9b6e-2200-4066-81be-df7d867ea60e-kube-api-access-p6w52\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.847235 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.867658 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.931331 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.934059 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v224c"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.958251 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v224c"] Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.975215 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" event={"ID":"aabbe14d-15a9-4e93-a862-00fd5cf988a0","Type":"ContainerStarted","Data":"9c9585b4cf8508c293a0710b54a06dd305ed4228ffdbd0fdb141ab1224a18f4a"} Dec 12 00:45:23 crc kubenswrapper[4606]: I1212 00:45:23.986882 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.005938 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jx54m" event={"ID":"4b21b129-dddc-4a41-ad1f-6cad37d0aa07","Type":"ContainerStarted","Data":"5cf49a5d879c03def251f0378329479a6e1edafa5d925c0793be69d8d9c49a3c"} Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.107391 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.110966 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.113258 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.127812 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.148569 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.177933 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-98jlt"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.190266 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f59cc7d7-l4jln"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.246710 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.246766 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-scripts\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.246819 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.246866 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.246906 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.247215 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.247318 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8jf5\" (UniqueName: \"kubernetes.io/projected/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-kube-api-access-p8jf5\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.247377 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-logs\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.267871 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.350966 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.351505 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-scripts\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.351602 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-config-data\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.351677 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.351760 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.351871 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.351945 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8jf5\" (UniqueName: \"kubernetes.io/projected/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-kube-api-access-p8jf5\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.352048 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-logs\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.352704 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-logs\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.351528 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.357054 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.364276 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.371630 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-config-data\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.377649 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.383867 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9xbj8"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.385141 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-scripts\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.394363 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hp9lg"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.396536 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8jf5\" (UniqueName: \"kubernetes.io/projected/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-kube-api-access-p8jf5\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: W1212 00:45:24.408089 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7978c0cd_b859_49f1_ad0e_1cb88ff58495.slice/crio-76e42ca04fa8a4754b1af4600675dfe684c3689b875e7071dbc32d210f1af767 WatchSource:0}: Error finding container 76e42ca04fa8a4754b1af4600675dfe684c3689b875e7071dbc32d210f1af767: Status 404 returned error can't find the container with id 76e42ca04fa8a4754b1af4600675dfe684c3689b875e7071dbc32d210f1af767 Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.427011 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.657721 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.838556 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-j2fv2"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.858268 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4hg5f"] Dec 12 00:45:24 crc kubenswrapper[4606]: W1212 00:45:24.872234 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a0b6b98_c743_4435_a967_55c0edb95531.slice/crio-b722a345ecaf7177444d77904c2aa47c636550c83d26daaee6c1a11d15b11cb6 WatchSource:0}: Error finding container b722a345ecaf7177444d77904c2aa47c636550c83d26daaee6c1a11d15b11cb6: Status 404 returned error can't find the container with id b722a345ecaf7177444d77904c2aa47c636550c83d26daaee6c1a11d15b11cb6 Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.875425 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qmcpn"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.885152 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64d56f87f-bhmtm"] Dec 12 00:45:24 crc kubenswrapper[4606]: I1212 00:45:24.980422 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.055158 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4hg5f" event={"ID":"5a0b6b98-c743-4435-a967-55c0edb95531","Type":"ContainerStarted","Data":"b722a345ecaf7177444d77904c2aa47c636550c83d26daaee6c1a11d15b11cb6"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.072693 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" 
event={"ID":"39112d38-7887-4e2b-b32f-d679ca162941","Type":"ContainerStarted","Data":"c9003bd8eb2677005882dce3843f74353f40d5852fa80cd621a61efcc5166985"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.098196 4606 generic.go:334] "Generic (PLEG): container finished" podID="aabbe14d-15a9-4e93-a862-00fd5cf988a0" containerID="8320c6f2e74d1e1ecbe069ee2c4adf599295860b06514c31adc0bf0897f8c504" exitCode=0 Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.098272 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" event={"ID":"aabbe14d-15a9-4e93-a862-00fd5cf988a0","Type":"ContainerDied","Data":"8320c6f2e74d1e1ecbe069ee2c4adf599295860b06514c31adc0bf0897f8c504"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.113744 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qmcpn" event={"ID":"73c136c9-e12d-434a-aab1-ed21dfaf0f60","Type":"ContainerStarted","Data":"3e58127df5cf06183dade2f6cb5c9d87ad85ab2f0678f7317084d0d6caec9458"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.128853 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hp9lg" event={"ID":"ce0c3f48-d61e-420d-ab53-61361c7a4a25","Type":"ContainerStarted","Data":"43e74f811bdd82896befc7495dab3869fa3c81bec160c4912838cddece137eca"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.128890 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hp9lg" event={"ID":"ce0c3f48-d61e-420d-ab53-61361c7a4a25","Type":"ContainerStarted","Data":"bbe469aed3c875a1e09a439e9614fecee00849e88743f45216af1c618533350c"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.135540 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f59cc7d7-l4jln" event={"ID":"b710b7ca-abc9-465b-a279-949bd345962b","Type":"ContainerStarted","Data":"3c630bf6508fa4bdcdc67478b61b4cccfe5ea168aa23c02bf5183d1130cc927e"} Dec 12 00:45:25 crc kubenswrapper[4606]: 
I1212 00:45:25.141685 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.154773 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hp9lg" podStartSLOduration=3.154757624 podStartE2EDuration="3.154757624s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:25.154019554 +0000 UTC m=+1315.699372420" watchObservedRunningTime="2025-12-12 00:45:25.154757624 +0000 UTC m=+1315.700110490" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.156367 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64d56f87f-bhmtm" event={"ID":"fc444db7-f445-40e7-bea0-f16f6afc2b91","Type":"ContainerStarted","Data":"cc5eff8ae5bf9266942669732f2199a14fd8f8eb75ef357936b53da3fc821d49"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.160275 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9xbj8" event={"ID":"7978c0cd-b859-49f1-ad0e-1cb88ff58495","Type":"ContainerStarted","Data":"76e42ca04fa8a4754b1af4600675dfe684c3689b875e7071dbc32d210f1af767"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.201788 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jx54m" event={"ID":"4b21b129-dddc-4a41-ad1f-6cad37d0aa07","Type":"ContainerStarted","Data":"a4823e5b2d7c2714afe1b56e652606df93932f01689b012beee3a434a8505f96"} Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.375206 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jx54m" podStartSLOduration=4.375171259 podStartE2EDuration="4.375171259s" podCreationTimestamp="2025-12-12 00:45:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:25.228928805 +0000 UTC m=+1315.774281671" watchObservedRunningTime="2025-12-12 00:45:25.375171259 +0000 UTC m=+1315.920524125" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.375997 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.540712 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.696845 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdcv9\" (UniqueName: \"kubernetes.io/projected/aabbe14d-15a9-4e93-a862-00fd5cf988a0-kube-api-access-pdcv9\") pod \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.696901 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-config\") pod \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.697011 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-sb\") pod \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.697061 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-nb\") pod \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 
00:45:25.697125 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-svc\") pod \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.697290 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-swift-storage-0\") pod \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\" (UID: \"aabbe14d-15a9-4e93-a862-00fd5cf988a0\") " Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.705494 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aabbe14d-15a9-4e93-a862-00fd5cf988a0-kube-api-access-pdcv9" (OuterVolumeSpecName: "kube-api-access-pdcv9") pod "aabbe14d-15a9-4e93-a862-00fd5cf988a0" (UID: "aabbe14d-15a9-4e93-a862-00fd5cf988a0"). InnerVolumeSpecName "kube-api-access-pdcv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.727797 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aabbe14d-15a9-4e93-a862-00fd5cf988a0" (UID: "aabbe14d-15a9-4e93-a862-00fd5cf988a0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.731748 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d" path="/var/lib/kubelet/pods/d9a87b6a-f358-4e8f-90a9-0c3cc5b8971d/volumes" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.737303 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aabbe14d-15a9-4e93-a862-00fd5cf988a0" (UID: "aabbe14d-15a9-4e93-a862-00fd5cf988a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.756108 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aabbe14d-15a9-4e93-a862-00fd5cf988a0" (UID: "aabbe14d-15a9-4e93-a862-00fd5cf988a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.777334 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-config" (OuterVolumeSpecName: "config") pod "aabbe14d-15a9-4e93-a862-00fd5cf988a0" (UID: "aabbe14d-15a9-4e93-a862-00fd5cf988a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.804919 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.804951 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.804966 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdcv9\" (UniqueName: \"kubernetes.io/projected/aabbe14d-15a9-4e93-a862-00fd5cf988a0-kube-api-access-pdcv9\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.804975 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.804982 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.808793 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aabbe14d-15a9-4e93-a862-00fd5cf988a0" (UID: "aabbe14d-15a9-4e93-a862-00fd5cf988a0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.906866 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aabbe14d-15a9-4e93-a862-00fd5cf988a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:25 crc kubenswrapper[4606]: I1212 00:45:25.966811 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.058497 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f59cc7d7-l4jln"] Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.101488 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b55cc6bf7-ckhd2"] Dec 12 00:45:26 crc kubenswrapper[4606]: E1212 00:45:26.101886 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabbe14d-15a9-4e93-a862-00fd5cf988a0" containerName="init" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.101900 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabbe14d-15a9-4e93-a862-00fd5cf988a0" containerName="init" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.102064 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabbe14d-15a9-4e93-a862-00fd5cf988a0" containerName="init" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.112556 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.116411 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b55cc6bf7-ckhd2"] Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.148065 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.213209 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/465e3cb1-d565-45fb-9251-de59579f3add-horizon-secret-key\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.213258 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-config-data\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.213312 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-scripts\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.213362 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465e3cb1-d565-45fb-9251-de59579f3add-logs\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.213395 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjwp4\" (UniqueName: \"kubernetes.io/projected/465e3cb1-d565-45fb-9251-de59579f3add-kube-api-access-jjwp4\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.239538 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerStarted","Data":"19657743b1bb044f1d0b8091aa2db14366bbfaf9e010174bb9f635c7d1affe54"} Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.240789 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.287488 4606 generic.go:334] "Generic (PLEG): container finished" podID="39112d38-7887-4e2b-b32f-d679ca162941" containerID="527eee0fb28cbe9a028b32af3bdd23b9100c7cb7378cbd7337cf61c4a6455a68" exitCode=0 Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.287721 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" event={"ID":"39112d38-7887-4e2b-b32f-d679ca162941","Type":"ContainerDied","Data":"527eee0fb28cbe9a028b32af3bdd23b9100c7cb7378cbd7337cf61c4a6455a68"} Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.315346 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465e3cb1-d565-45fb-9251-de59579f3add-logs\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.315411 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjwp4\" (UniqueName: 
\"kubernetes.io/projected/465e3cb1-d565-45fb-9251-de59579f3add-kube-api-access-jjwp4\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.315462 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/465e3cb1-d565-45fb-9251-de59579f3add-horizon-secret-key\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.315486 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-config-data\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.315537 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-scripts\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.317601 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465e3cb1-d565-45fb-9251-de59579f3add-logs\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.330691 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-scripts\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " 
pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.331946 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" event={"ID":"aabbe14d-15a9-4e93-a862-00fd5cf988a0","Type":"ContainerDied","Data":"9c9585b4cf8508c293a0710b54a06dd305ed4228ffdbd0fdb141ab1224a18f4a"} Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.331979 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-config-data\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.332112 4606 scope.go:117] "RemoveContainer" containerID="8320c6f2e74d1e1ecbe069ee2c4adf599295860b06514c31adc0bf0897f8c504" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.332218 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-98jlt" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.332788 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/465e3cb1-d565-45fb-9251-de59579f3add-horizon-secret-key\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.351981 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592","Type":"ContainerStarted","Data":"d9072929497f921ef6626ad13b1d1c88cc36e6ac2bad30a697f8aaf3ef877e9b"} Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.360774 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjwp4\" (UniqueName: \"kubernetes.io/projected/465e3cb1-d565-45fb-9251-de59579f3add-kube-api-access-jjwp4\") pod \"horizon-b55cc6bf7-ckhd2\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.364237 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f83e9b6e-2200-4066-81be-df7d867ea60e","Type":"ContainerStarted","Data":"689d234d744cef4e10f118942ad28cef4e1cdb18ccf4de22c9445ce54c9187b7"} Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.449966 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.554289 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-98jlt"] Dec 12 00:45:26 crc kubenswrapper[4606]: I1212 00:45:26.563652 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-98jlt"] Dec 12 00:45:27 crc kubenswrapper[4606]: I1212 00:45:27.205198 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b55cc6bf7-ckhd2"] Dec 12 00:45:27 crc kubenswrapper[4606]: I1212 00:45:27.396860 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" event={"ID":"39112d38-7887-4e2b-b32f-d679ca162941","Type":"ContainerStarted","Data":"36fca641890dbd9a9be70fa6d54d9b5aaea44c7d21f2e44a90cbd3b6504661b1"} Dec 12 00:45:27 crc kubenswrapper[4606]: I1212 00:45:27.397093 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:27 crc kubenswrapper[4606]: I1212 00:45:27.406376 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592","Type":"ContainerStarted","Data":"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62"} Dec 12 00:45:27 crc kubenswrapper[4606]: I1212 00:45:27.407927 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b55cc6bf7-ckhd2" event={"ID":"465e3cb1-d565-45fb-9251-de59579f3add","Type":"ContainerStarted","Data":"9d92f8ce684fac8750989f792341f509ed07e1ddbe387fc745d30d311638ab5e"} Dec 12 00:45:27 crc kubenswrapper[4606]: I1212 00:45:27.411856 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f83e9b6e-2200-4066-81be-df7d867ea60e","Type":"ContainerStarted","Data":"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3"} Dec 12 00:45:27 
crc kubenswrapper[4606]: I1212 00:45:27.423446 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" podStartSLOduration=5.423428448 podStartE2EDuration="5.423428448s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:27.420602532 +0000 UTC m=+1317.965955388" watchObservedRunningTime="2025-12-12 00:45:27.423428448 +0000 UTC m=+1317.968781314" Dec 12 00:45:27 crc kubenswrapper[4606]: I1212 00:45:27.742769 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aabbe14d-15a9-4e93-a862-00fd5cf988a0" path="/var/lib/kubelet/pods/aabbe14d-15a9-4e93-a862-00fd5cf988a0/volumes" Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.440490 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-log" containerID="cri-o://a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3" gracePeriod=30 Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.440889 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f83e9b6e-2200-4066-81be-df7d867ea60e","Type":"ContainerStarted","Data":"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f"} Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.440978 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-httpd" containerID="cri-o://4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f" gracePeriod=30 Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.458098 4606 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerName="glance-log" containerID="cri-o://df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62" gracePeriod=30 Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.458520 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592","Type":"ContainerStarted","Data":"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58"} Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.458583 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerName="glance-httpd" containerID="cri-o://a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58" gracePeriod=30 Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.475240 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.475222676 podStartE2EDuration="6.475222676s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:28.463867951 +0000 UTC m=+1319.009220817" watchObservedRunningTime="2025-12-12 00:45:28.475222676 +0000 UTC m=+1319.020575542" Dec 12 00:45:28 crc kubenswrapper[4606]: I1212 00:45:28.496884 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.496872647 podStartE2EDuration="6.496872647s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:45:28.495531331 +0000 UTC m=+1319.040884207" watchObservedRunningTime="2025-12-12 
00:45:28.496872647 +0000 UTC m=+1319.042225513" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.258839 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.329160 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381277 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-public-tls-certs\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381331 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-scripts\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381357 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6w52\" (UniqueName: \"kubernetes.io/projected/f83e9b6e-2200-4066-81be-df7d867ea60e-kube-api-access-p6w52\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381390 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-internal-tls-certs\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381469 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-config-data\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381492 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-httpd-run\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381519 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381537 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-scripts\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381563 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-config-data\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381619 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-httpd-run\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381639 4606 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-logs\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381683 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-combined-ca-bundle\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381717 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8jf5\" (UniqueName: \"kubernetes.io/projected/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-kube-api-access-p8jf5\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381735 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-combined-ca-bundle\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381754 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\" (UID: \"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592\") " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.381777 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-logs\") pod \"f83e9b6e-2200-4066-81be-df7d867ea60e\" (UID: \"f83e9b6e-2200-4066-81be-df7d867ea60e\") " Dec 12 00:45:29 crc 
kubenswrapper[4606]: I1212 00:45:29.382832 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-logs" (OuterVolumeSpecName: "logs") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.383247 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.401614 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-scripts" (OuterVolumeSpecName: "scripts") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.402114 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.402235 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-kube-api-access-p8jf5" (OuterVolumeSpecName: "kube-api-access-p8jf5") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "kube-api-access-p8jf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.402826 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-logs" (OuterVolumeSpecName: "logs") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.403466 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-scripts" (OuterVolumeSpecName: "scripts") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.403565 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.417674 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83e9b6e-2200-4066-81be-df7d867ea60e-kube-api-access-p6w52" (OuterVolumeSpecName: "kube-api-access-p6w52") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). InnerVolumeSpecName "kube-api-access-p6w52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.428725 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.459855 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.463558 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.483908 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.484859 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.485055 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.485236 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6w52\" (UniqueName: \"kubernetes.io/projected/f83e9b6e-2200-4066-81be-df7d867ea60e-kube-api-access-p6w52\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.485480 4606 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.485648 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.485897 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.485997 4606 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f83e9b6e-2200-4066-81be-df7d867ea60e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.486221 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.486357 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.486471 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8jf5\" (UniqueName: \"kubernetes.io/projected/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-kube-api-access-p8jf5\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.487107 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.488344 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.497581 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-config-data" (OuterVolumeSpecName: "config-data") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.522295 4606 generic.go:334] "Generic (PLEG): container finished" podID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerID="a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58" exitCode=143 Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.522697 4606 generic.go:334] "Generic (PLEG): container finished" podID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerID="df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62" exitCode=143 Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.523340 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592","Type":"ContainerDied","Data":"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58"} Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.523556 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592","Type":"ContainerDied","Data":"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62"} Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.523645 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bce18320-9dbb-4a06-b2f8-0d0bbd6c5592","Type":"ContainerDied","Data":"d9072929497f921ef6626ad13b1d1c88cc36e6ac2bad30a697f8aaf3ef877e9b"} Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.523791 4606 scope.go:117] "RemoveContainer" containerID="a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.524216 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.530110 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-config-data" (OuterVolumeSpecName: "config-data") pod "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" (UID: "bce18320-9dbb-4a06-b2f8-0d0bbd6c5592"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.530460 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.533427 4606 generic.go:334] "Generic (PLEG): container finished" podID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerID="4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f" exitCode=0 Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.533456 4606 generic.go:334] "Generic (PLEG): container finished" podID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerID="a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3" exitCode=143 Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.533480 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f83e9b6e-2200-4066-81be-df7d867ea60e","Type":"ContainerDied","Data":"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f"} Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.533507 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f83e9b6e-2200-4066-81be-df7d867ea60e","Type":"ContainerDied","Data":"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3"} Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.533528 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"f83e9b6e-2200-4066-81be-df7d867ea60e","Type":"ContainerDied","Data":"689d234d744cef4e10f118942ad28cef4e1cdb18ccf4de22c9445ce54c9187b7"} Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.533593 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.537801 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.544472 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f83e9b6e-2200-4066-81be-df7d867ea60e" (UID: "f83e9b6e-2200-4066-81be-df7d867ea60e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.556574 4606 scope.go:117] "RemoveContainer" containerID="df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.590632 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.590670 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.590682 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.590693 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.590704 4606 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.590715 4606 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83e9b6e-2200-4066-81be-df7d867ea60e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.631445 4606 scope.go:117] "RemoveContainer" containerID="a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58" Dec 12 00:45:29 crc kubenswrapper[4606]: 
E1212 00:45:29.631970 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58\": container with ID starting with a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58 not found: ID does not exist" containerID="a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.632060 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58"} err="failed to get container status \"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58\": rpc error: code = NotFound desc = could not find container \"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58\": container with ID starting with a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58 not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.632093 4606 scope.go:117] "RemoveContainer" containerID="df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62" Dec 12 00:45:29 crc kubenswrapper[4606]: E1212 00:45:29.632720 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62\": container with ID starting with df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62 not found: ID does not exist" containerID="df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.632762 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62"} err="failed to get container status \"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62\": 
rpc error: code = NotFound desc = could not find container \"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62\": container with ID starting with df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62 not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.632789 4606 scope.go:117] "RemoveContainer" containerID="a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.633772 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58"} err="failed to get container status \"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58\": rpc error: code = NotFound desc = could not find container \"a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58\": container with ID starting with a9e7ec08ae94c320a49feba31b5760a7bd1da716391c628c830c85876adc0e58 not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.633804 4606 scope.go:117] "RemoveContainer" containerID="df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.634371 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62"} err="failed to get container status \"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62\": rpc error: code = NotFound desc = could not find container \"df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62\": container with ID starting with df5af5ac6e1c2441d6ead74e75687ed6dc8fe8f203849355a805a1e71245be62 not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.634394 4606 scope.go:117] "RemoveContainer" containerID="4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f" Dec 12 00:45:29 crc 
kubenswrapper[4606]: I1212 00:45:29.737199 4606 scope.go:117] "RemoveContainer" containerID="a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.836658 4606 scope.go:117] "RemoveContainer" containerID="4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f" Dec 12 00:45:29 crc kubenswrapper[4606]: E1212 00:45:29.838377 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f\": container with ID starting with 4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f not found: ID does not exist" containerID="4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.838487 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f"} err="failed to get container status \"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f\": rpc error: code = NotFound desc = could not find container \"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f\": container with ID starting with 4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.838560 4606 scope.go:117] "RemoveContainer" containerID="a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3" Dec 12 00:45:29 crc kubenswrapper[4606]: E1212 00:45:29.839481 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3\": container with ID starting with a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3 not found: ID does not exist" 
containerID="a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.839523 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3"} err="failed to get container status \"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3\": rpc error: code = NotFound desc = could not find container \"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3\": container with ID starting with a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3 not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.839548 4606 scope.go:117] "RemoveContainer" containerID="4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.840317 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f"} err="failed to get container status \"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f\": rpc error: code = NotFound desc = could not find container \"4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f\": container with ID starting with 4c0bf1473d53a6248d4693740a95beb1be39d626a13197a04ecf3a7b0e8f4b2f not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.840335 4606 scope.go:117] "RemoveContainer" containerID="a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.840992 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3"} err="failed to get container status \"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3\": rpc error: code = NotFound desc = could 
not find container \"a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3\": container with ID starting with a59d48bcdde2f2a0ae5a1b3d56fc95b761c413d6a7f2fb272bb5b35ce47d03d3 not found: ID does not exist" Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.969052 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.987312 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:29 crc kubenswrapper[4606]: I1212 00:45:29.991383 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.009729 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.022214 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:30 crc kubenswrapper[4606]: E1212 00:45:30.023826 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerName="glance-log" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.023847 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerName="glance-log" Dec 12 00:45:30 crc kubenswrapper[4606]: E1212 00:45:30.023863 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-httpd" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.023869 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-httpd" Dec 12 00:45:30 crc kubenswrapper[4606]: E1212 00:45:30.023887 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" 
containerName="glance-httpd" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.023893 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerName="glance-httpd" Dec 12 00:45:30 crc kubenswrapper[4606]: E1212 00:45:30.023901 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-log" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.023907 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-log" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.024065 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerName="glance-httpd" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.024074 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" containerName="glance-log" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.024085 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-log" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.024099 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" containerName="glance-httpd" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.024946 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.031070 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.034357 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.034435 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gs7ln" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.034821 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.035224 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.056470 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.058557 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.061605 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.061985 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.069878 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101354 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101414 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7kns\" (UniqueName: \"kubernetes.io/projected/8e040a81-ee68-4745-b947-61aa28e33fa7-kube-api-access-b7kns\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101458 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101496 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-logs\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101590 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101619 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101652 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-config-data\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101739 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101881 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.101933 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.102028 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.102086 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.102130 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-scripts\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.102168 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.102219 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.102267 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gztps\" (UniqueName: \"kubernetes.io/projected/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-kube-api-access-gztps\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208299 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208646 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208680 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-scripts\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208708 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208732 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208760 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gztps\" (UniqueName: \"kubernetes.io/projected/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-kube-api-access-gztps\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208796 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208831 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7kns\" (UniqueName: \"kubernetes.io/projected/8e040a81-ee68-4745-b947-61aa28e33fa7-kube-api-access-b7kns\") 
pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208860 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208888 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-logs\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208913 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208940 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208966 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.208995 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.209048 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.209078 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.209529 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.209835 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 
12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.217465 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-logs\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.217803 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.218112 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.225542 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.226304 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.227025 4606 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.229048 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.229534 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.230513 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-config-data\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.231701 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.235414 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.250240 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gztps\" (UniqueName: \"kubernetes.io/projected/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-kube-api-access-gztps\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.261901 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7kns\" (UniqueName: \"kubernetes.io/projected/8e040a81-ee68-4745-b947-61aa28e33fa7-kube-api-access-b7kns\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.262103 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-scripts\") pod \"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.305098 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.330059 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-external-api-0\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.364672 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.402573 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.546477 4606 generic.go:334] "Generic (PLEG): container finished" podID="4b21b129-dddc-4a41-ad1f-6cad37d0aa07" containerID="a4823e5b2d7c2714afe1b56e652606df93932f01689b012beee3a434a8505f96" exitCode=0 Dec 12 00:45:30 crc kubenswrapper[4606]: I1212 00:45:30.546536 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jx54m" event={"ID":"4b21b129-dddc-4a41-ad1f-6cad37d0aa07","Type":"ContainerDied","Data":"a4823e5b2d7c2714afe1b56e652606df93932f01689b012beee3a434a8505f96"} Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.142269 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.273613 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.522298 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64d56f87f-bhmtm"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.569519 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79c99578bb-cdgsn"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.577684 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.584534 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.615518 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79c99578bb-cdgsn"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.644935 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.658016 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkmtn\" (UniqueName: \"kubernetes.io/projected/9ede4720-3fd7-4524-adfc-c1c395f12170-kube-api-access-pkmtn\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.658076 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-tls-certs\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.658103 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ede4720-3fd7-4524-adfc-c1c395f12170-logs\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.658119 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-scripts\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.658164 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-secret-key\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.658257 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-combined-ca-bundle\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.658285 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-config-data\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.759623 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ede4720-3fd7-4524-adfc-c1c395f12170-logs\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.759663 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-scripts\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.759708 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-secret-key\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.759727 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-combined-ca-bundle\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.759750 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-config-data\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.759823 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkmtn\" (UniqueName: \"kubernetes.io/projected/9ede4720-3fd7-4524-adfc-c1c395f12170-kube-api-access-pkmtn\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.759852 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-tls-certs\") 
pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.761428 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce18320-9dbb-4a06-b2f8-0d0bbd6c5592" path="/var/lib/kubelet/pods/bce18320-9dbb-4a06-b2f8-0d0bbd6c5592/volumes" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.762326 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83e9b6e-2200-4066-81be-df7d867ea60e" path="/var/lib/kubelet/pods/f83e9b6e-2200-4066-81be-df7d867ea60e/volumes" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.762927 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b55cc6bf7-ckhd2"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.762959 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b9fb498f6-62fcc"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.763022 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ede4720-3fd7-4524-adfc-c1c395f12170-logs\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.764811 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.768315 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.783927 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b9fb498f6-62fcc"] Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.785513 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-secret-key\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.786623 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-config-data\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.786936 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-scripts\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.798832 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-tls-certs\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.802567 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pkmtn\" (UniqueName: \"kubernetes.io/projected/9ede4720-3fd7-4524-adfc-c1c395f12170-kube-api-access-pkmtn\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.825882 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-combined-ca-bundle\") pod \"horizon-79c99578bb-cdgsn\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.904613 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.969842 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e38df57e-1a86-4c45-bf40-6282a6a049ed-scripts\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.969929 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-combined-ca-bundle\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.969970 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e38df57e-1a86-4c45-bf40-6282a6a049ed-logs\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:31 
crc kubenswrapper[4606]: I1212 00:45:31.969997 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-horizon-tls-certs\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.970047 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e38df57e-1a86-4c45-bf40-6282a6a049ed-config-data\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.970074 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv925\" (UniqueName: \"kubernetes.io/projected/e38df57e-1a86-4c45-bf40-6282a6a049ed-kube-api-access-kv925\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:31 crc kubenswrapper[4606]: I1212 00:45:31.970108 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-horizon-secret-key\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.072202 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e38df57e-1a86-4c45-bf40-6282a6a049ed-config-data\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc 
kubenswrapper[4606]: I1212 00:45:32.072281 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv925\" (UniqueName: \"kubernetes.io/projected/e38df57e-1a86-4c45-bf40-6282a6a049ed-kube-api-access-kv925\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.072319 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-horizon-secret-key\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.072413 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e38df57e-1a86-4c45-bf40-6282a6a049ed-scripts\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.072505 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-combined-ca-bundle\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.072575 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e38df57e-1a86-4c45-bf40-6282a6a049ed-logs\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.072634 4606 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-horizon-tls-certs\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.074429 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e38df57e-1a86-4c45-bf40-6282a6a049ed-scripts\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.074680 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e38df57e-1a86-4c45-bf40-6282a6a049ed-logs\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.075354 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e38df57e-1a86-4c45-bf40-6282a6a049ed-config-data\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.076899 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-horizon-tls-certs\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.079793 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-horizon-secret-key\") pod \"horizon-b9fb498f6-62fcc\" (UID: 
\"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.079878 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e38df57e-1a86-4c45-bf40-6282a6a049ed-combined-ca-bundle\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.104560 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv925\" (UniqueName: \"kubernetes.io/projected/e38df57e-1a86-4c45-bf40-6282a6a049ed-kube-api-access-kv925\") pod \"horizon-b9fb498f6-62fcc\" (UID: \"e38df57e-1a86-4c45-bf40-6282a6a049ed\") " pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:32 crc kubenswrapper[4606]: I1212 00:45:32.177019 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:45:33 crc kubenswrapper[4606]: I1212 00:45:33.544348 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:45:33 crc kubenswrapper[4606]: I1212 00:45:33.596031 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ms77"] Dec 12 00:45:33 crc kubenswrapper[4606]: I1212 00:45:33.604645 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" containerID="cri-o://8491d7f65e8fd7e0f7e430fda22ef9861a076be3942ad2ab7e3118166e6f9466" gracePeriod=10 Dec 12 00:45:34 crc kubenswrapper[4606]: I1212 00:45:34.667103 4606 generic.go:334] "Generic (PLEG): container finished" podID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerID="8491d7f65e8fd7e0f7e430fda22ef9861a076be3942ad2ab7e3118166e6f9466" exitCode=0 Dec 12 
00:45:34 crc kubenswrapper[4606]: I1212 00:45:34.667145 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" event={"ID":"31335089-493b-42bb-9a5f-cb4ea39951f4","Type":"ContainerDied","Data":"8491d7f65e8fd7e0f7e430fda22ef9861a076be3942ad2ab7e3118166e6f9466"} Dec 12 00:45:35 crc kubenswrapper[4606]: W1212 00:45:35.679679 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2bccf3c_0850_4ad7_96c0_415f9c6bb56c.slice/crio-8dc466df9cab9566d386a159c24b03c3ee2c8f5e339aeace048ecc0c6ba0399c WatchSource:0}: Error finding container 8dc466df9cab9566d386a159c24b03c3ee2c8f5e339aeace048ecc0c6ba0399c: Status 404 returned error can't find the container with id 8dc466df9cab9566d386a159c24b03c3ee2c8f5e339aeace048ecc0c6ba0399c Dec 12 00:45:36 crc kubenswrapper[4606]: W1212 00:45:36.149659 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e040a81_ee68_4745_b947_61aa28e33fa7.slice/crio-41e51447aab445cb43fca395871267b62a042140de05de1b40d80a02158d0418 WatchSource:0}: Error finding container 41e51447aab445cb43fca395871267b62a042140de05de1b40d80a02158d0418: Status 404 returned error can't find the container with id 41e51447aab445cb43fca395871267b62a042140de05de1b40d80a02158d0418 Dec 12 00:45:36 crc kubenswrapper[4606]: I1212 00:45:36.453624 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Dec 12 00:45:36 crc kubenswrapper[4606]: I1212 00:45:36.688150 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c","Type":"ContainerStarted","Data":"8dc466df9cab9566d386a159c24b03c3ee2c8f5e339aeace048ecc0c6ba0399c"} Dec 12 00:45:36 crc kubenswrapper[4606]: I1212 00:45:36.689532 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e040a81-ee68-4745-b947-61aa28e33fa7","Type":"ContainerStarted","Data":"41e51447aab445cb43fca395871267b62a042140de05de1b40d80a02158d0418"} Dec 12 00:45:43 crc kubenswrapper[4606]: E1212 00:45:43.732148 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 12 00:45:43 crc kubenswrapper[4606]: E1212 00:45:43.732768 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chdh64h674h57dh594h5c6h57ch587h5cdh578h5f8h597h677hd9h567h676h568hdbh688hf5h56dh55chbchd8hbdhb9h597h657h684h74hdcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mq47m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5f59cc7d7-l4jln_openstack(b710b7ca-abc9-465b-a279-949bd345962b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:45:43 crc kubenswrapper[4606]: E1212 
00:45:43.749122 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5f59cc7d7-l4jln" podUID="b710b7ca-abc9-465b-a279-949bd345962b" Dec 12 00:45:45 crc kubenswrapper[4606]: E1212 00:45:45.283305 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 12 00:45:45 crc kubenswrapper[4606]: E1212 00:45:45.283766 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2cll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-4hg5f_openstack(5a0b6b98-c743-4435-a967-55c0edb95531): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:45:45 crc kubenswrapper[4606]: E1212 00:45:45.285158 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-4hg5f" podUID="5a0b6b98-c743-4435-a967-55c0edb95531" Dec 12 00:45:45 crc kubenswrapper[4606]: E1212 00:45:45.308779 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 12 00:45:45 crc kubenswrapper[4606]: E1212 00:45:45.308948 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64h74hf5h96h695h55dhbfhfchf7h578h574h75h584h5f6h66hdh58fh66ch5c8h67fh5bfh5h78h556h59ch6bhb9h694h695hf8h694h67fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdnbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-64d56f87f-bhmtm_openstack(fc444db7-f445-40e7-bea0-f16f6afc2b91): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:45:45 crc kubenswrapper[4606]: E1212 00:45:45.319006 
4606 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-64d56f87f-bhmtm" podUID="fc444db7-f445-40e7-bea0-f16f6afc2b91" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.385845 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.448012 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-combined-ca-bundle\") pod \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.448054 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffjsz\" (UniqueName: \"kubernetes.io/projected/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-kube-api-access-ffjsz\") pod \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.448113 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-fernet-keys\") pod \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.448167 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-scripts\") pod 
\"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.448229 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-credential-keys\") pod \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.448247 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-config-data\") pod \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\" (UID: \"4b21b129-dddc-4a41-ad1f-6cad37d0aa07\") " Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.457946 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4b21b129-dddc-4a41-ad1f-6cad37d0aa07" (UID: "4b21b129-dddc-4a41-ad1f-6cad37d0aa07"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.458070 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-kube-api-access-ffjsz" (OuterVolumeSpecName: "kube-api-access-ffjsz") pod "4b21b129-dddc-4a41-ad1f-6cad37d0aa07" (UID: "4b21b129-dddc-4a41-ad1f-6cad37d0aa07"). InnerVolumeSpecName "kube-api-access-ffjsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.463285 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4b21b129-dddc-4a41-ad1f-6cad37d0aa07" (UID: "4b21b129-dddc-4a41-ad1f-6cad37d0aa07"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.464717 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-scripts" (OuterVolumeSpecName: "scripts") pod "4b21b129-dddc-4a41-ad1f-6cad37d0aa07" (UID: "4b21b129-dddc-4a41-ad1f-6cad37d0aa07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.485681 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-config-data" (OuterVolumeSpecName: "config-data") pod "4b21b129-dddc-4a41-ad1f-6cad37d0aa07" (UID: "4b21b129-dddc-4a41-ad1f-6cad37d0aa07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.490656 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b21b129-dddc-4a41-ad1f-6cad37d0aa07" (UID: "4b21b129-dddc-4a41-ad1f-6cad37d0aa07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.549851 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.549889 4606 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.549903 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.549916 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.549928 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffjsz\" (UniqueName: \"kubernetes.io/projected/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-kube-api-access-ffjsz\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.549939 4606 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21b129-dddc-4a41-ad1f-6cad37d0aa07-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.786818 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jx54m" Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.787024 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jx54m" event={"ID":"4b21b129-dddc-4a41-ad1f-6cad37d0aa07","Type":"ContainerDied","Data":"5cf49a5d879c03def251f0378329479a6e1edafa5d925c0793be69d8d9c49a3c"} Dec 12 00:45:45 crc kubenswrapper[4606]: I1212 00:45:45.787068 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cf49a5d879c03def251f0378329479a6e1edafa5d925c0793be69d8d9c49a3c" Dec 12 00:45:45 crc kubenswrapper[4606]: E1212 00:45:45.789147 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-4hg5f" podUID="5a0b6b98-c743-4435-a967-55c0edb95531" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.453798 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.483939 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jx54m"] Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.496248 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jx54m"] Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.601996 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kcwn5"] Dec 12 00:45:46 crc kubenswrapper[4606]: E1212 00:45:46.602550 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b21b129-dddc-4a41-ad1f-6cad37d0aa07" containerName="keystone-bootstrap" Dec 12 
00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.602573 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b21b129-dddc-4a41-ad1f-6cad37d0aa07" containerName="keystone-bootstrap" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.602785 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b21b129-dddc-4a41-ad1f-6cad37d0aa07" containerName="keystone-bootstrap" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.603621 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.605798 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.605918 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.606143 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wkdqx" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.606493 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.610837 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.647361 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kcwn5"] Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.668566 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 
00:45:46.668601 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-scripts\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.668655 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-fernet-keys\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.668674 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-credential-keys\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.668696 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-config-data\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.668714 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnl4g\" (UniqueName: \"kubernetes.io/projected/8ff95d54-4b78-48cb-b8c9-33801a6818f0-kube-api-access-gnl4g\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.770697 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.770739 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-scripts\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.770815 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-fernet-keys\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.770832 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-credential-keys\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.770873 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-config-data\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.770902 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnl4g\" (UniqueName: 
\"kubernetes.io/projected/8ff95d54-4b78-48cb-b8c9-33801a6818f0-kube-api-access-gnl4g\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.776611 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-scripts\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.776800 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.776889 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-config-data\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.777315 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-fernet-keys\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.779050 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-credential-keys\") pod \"keystone-bootstrap-kcwn5\" (UID: 
\"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.788779 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnl4g\" (UniqueName: \"kubernetes.io/projected/8ff95d54-4b78-48cb-b8c9-33801a6818f0-kube-api-access-gnl4g\") pod \"keystone-bootstrap-kcwn5\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:46 crc kubenswrapper[4606]: I1212 00:45:46.939980 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:45:47 crc kubenswrapper[4606]: I1212 00:45:47.711527 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b21b129-dddc-4a41-ad1f-6cad37d0aa07" path="/var/lib/kubelet/pods/4b21b129-dddc-4a41-ad1f-6cad37d0aa07/volumes" Dec 12 00:45:47 crc kubenswrapper[4606]: I1212 00:45:47.814083 4606 generic.go:334] "Generic (PLEG): container finished" podID="ce0c3f48-d61e-420d-ab53-61361c7a4a25" containerID="43e74f811bdd82896befc7495dab3869fa3c81bec160c4912838cddece137eca" exitCode=0 Dec 12 00:45:47 crc kubenswrapper[4606]: I1212 00:45:47.814137 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hp9lg" event={"ID":"ce0c3f48-d61e-420d-ab53-61361c7a4a25","Type":"ContainerDied","Data":"43e74f811bdd82896befc7495dab3869fa3c81bec160c4912838cddece137eca"} Dec 12 00:45:51 crc kubenswrapper[4606]: I1212 00:45:51.454569 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Dec 12 00:45:51 crc kubenswrapper[4606]: I1212 00:45:51.455349 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 
00:45:56.456439 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.768842 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.772786 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.783694 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.799815 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885016 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b710b7ca-abc9-465b-a279-949bd345962b-logs\") pod \"b710b7ca-abc9-465b-a279-949bd345962b\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885398 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-svc\") pod \"31335089-493b-42bb-9a5f-cb4ea39951f4\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885425 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-scripts\") pod \"b710b7ca-abc9-465b-a279-949bd345962b\" (UID: 
\"b710b7ca-abc9-465b-a279-949bd345962b\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885455 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-nb\") pod \"31335089-493b-42bb-9a5f-cb4ea39951f4\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885519 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsfhm\" (UniqueName: \"kubernetes.io/projected/31335089-493b-42bb-9a5f-cb4ea39951f4-kube-api-access-vsfhm\") pod \"31335089-493b-42bb-9a5f-cb4ea39951f4\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885538 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-combined-ca-bundle\") pod \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885559 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-sb\") pod \"31335089-493b-42bb-9a5f-cb4ea39951f4\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885584 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-scripts\") pod \"fc444db7-f445-40e7-bea0-f16f6afc2b91\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885603 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-config\") pod \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885623 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-config\") pod \"31335089-493b-42bb-9a5f-cb4ea39951f4\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885650 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b710b7ca-abc9-465b-a279-949bd345962b-horizon-secret-key\") pod \"b710b7ca-abc9-465b-a279-949bd345962b\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885689 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq47m\" (UniqueName: \"kubernetes.io/projected/b710b7ca-abc9-465b-a279-949bd345962b-kube-api-access-mq47m\") pod \"b710b7ca-abc9-465b-a279-949bd345962b\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885706 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-config-data\") pod \"fc444db7-f445-40e7-bea0-f16f6afc2b91\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885732 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-config-data\") pod \"b710b7ca-abc9-465b-a279-949bd345962b\" (UID: \"b710b7ca-abc9-465b-a279-949bd345962b\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885747 4606 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc444db7-f445-40e7-bea0-f16f6afc2b91-horizon-secret-key\") pod \"fc444db7-f445-40e7-bea0-f16f6afc2b91\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885772 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w88cn\" (UniqueName: \"kubernetes.io/projected/ce0c3f48-d61e-420d-ab53-61361c7a4a25-kube-api-access-w88cn\") pod \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\" (UID: \"ce0c3f48-d61e-420d-ab53-61361c7a4a25\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885791 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdnbm\" (UniqueName: \"kubernetes.io/projected/fc444db7-f445-40e7-bea0-f16f6afc2b91-kube-api-access-xdnbm\") pod \"fc444db7-f445-40e7-bea0-f16f6afc2b91\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885826 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-swift-storage-0\") pod \"31335089-493b-42bb-9a5f-cb4ea39951f4\" (UID: \"31335089-493b-42bb-9a5f-cb4ea39951f4\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.885863 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc444db7-f445-40e7-bea0-f16f6afc2b91-logs\") pod \"fc444db7-f445-40e7-bea0-f16f6afc2b91\" (UID: \"fc444db7-f445-40e7-bea0-f16f6afc2b91\") " Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.886491 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc444db7-f445-40e7-bea0-f16f6afc2b91-logs" (OuterVolumeSpecName: "logs") pod 
"fc444db7-f445-40e7-bea0-f16f6afc2b91" (UID: "fc444db7-f445-40e7-bea0-f16f6afc2b91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.886915 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b710b7ca-abc9-465b-a279-949bd345962b-logs" (OuterVolumeSpecName: "logs") pod "b710b7ca-abc9-465b-a279-949bd345962b" (UID: "b710b7ca-abc9-465b-a279-949bd345962b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.888370 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-config-data" (OuterVolumeSpecName: "config-data") pod "b710b7ca-abc9-465b-a279-949bd345962b" (UID: "b710b7ca-abc9-465b-a279-949bd345962b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.897500 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b710b7ca-abc9-465b-a279-949bd345962b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b710b7ca-abc9-465b-a279-949bd345962b" (UID: "b710b7ca-abc9-465b-a279-949bd345962b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.906622 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-scripts" (OuterVolumeSpecName: "scripts") pod "b710b7ca-abc9-465b-a279-949bd345962b" (UID: "b710b7ca-abc9-465b-a279-949bd345962b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.917691 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc444db7-f445-40e7-bea0-f16f6afc2b91-kube-api-access-xdnbm" (OuterVolumeSpecName: "kube-api-access-xdnbm") pod "fc444db7-f445-40e7-bea0-f16f6afc2b91" (UID: "fc444db7-f445-40e7-bea0-f16f6afc2b91"). InnerVolumeSpecName "kube-api-access-xdnbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.930694 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-config-data" (OuterVolumeSpecName: "config-data") pod "fc444db7-f445-40e7-bea0-f16f6afc2b91" (UID: "fc444db7-f445-40e7-bea0-f16f6afc2b91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.930803 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc444db7-f445-40e7-bea0-f16f6afc2b91-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fc444db7-f445-40e7-bea0-f16f6afc2b91" (UID: "fc444db7-f445-40e7-bea0-f16f6afc2b91"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.930894 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31335089-493b-42bb-9a5f-cb4ea39951f4-kube-api-access-vsfhm" (OuterVolumeSpecName: "kube-api-access-vsfhm") pod "31335089-493b-42bb-9a5f-cb4ea39951f4" (UID: "31335089-493b-42bb-9a5f-cb4ea39951f4"). InnerVolumeSpecName "kube-api-access-vsfhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.932681 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b710b7ca-abc9-465b-a279-949bd345962b-kube-api-access-mq47m" (OuterVolumeSpecName: "kube-api-access-mq47m") pod "b710b7ca-abc9-465b-a279-949bd345962b" (UID: "b710b7ca-abc9-465b-a279-949bd345962b"). InnerVolumeSpecName "kube-api-access-mq47m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.935054 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0c3f48-d61e-420d-ab53-61361c7a4a25-kube-api-access-w88cn" (OuterVolumeSpecName: "kube-api-access-w88cn") pod "ce0c3f48-d61e-420d-ab53-61361c7a4a25" (UID: "ce0c3f48-d61e-420d-ab53-61361c7a4a25"). InnerVolumeSpecName "kube-api-access-w88cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.951127 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-scripts" (OuterVolumeSpecName: "scripts") pod "fc444db7-f445-40e7-bea0-f16f6afc2b91" (UID: "fc444db7-f445-40e7-bea0-f16f6afc2b91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.964618 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce0c3f48-d61e-420d-ab53-61361c7a4a25" (UID: "ce0c3f48-d61e-420d-ab53-61361c7a4a25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990138 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b710b7ca-abc9-465b-a279-949bd345962b-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990181 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990191 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsfhm\" (UniqueName: \"kubernetes.io/projected/31335089-493b-42bb-9a5f-cb4ea39951f4-kube-api-access-vsfhm\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990200 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990208 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990216 4606 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b710b7ca-abc9-465b-a279-949bd345962b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990226 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq47m\" (UniqueName: \"kubernetes.io/projected/b710b7ca-abc9-465b-a279-949bd345962b-kube-api-access-mq47m\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990235 4606 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc444db7-f445-40e7-bea0-f16f6afc2b91-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990243 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b710b7ca-abc9-465b-a279-949bd345962b-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990250 4606 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc444db7-f445-40e7-bea0-f16f6afc2b91-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990258 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w88cn\" (UniqueName: \"kubernetes.io/projected/ce0c3f48-d61e-420d-ab53-61361c7a4a25-kube-api-access-w88cn\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990268 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdnbm\" (UniqueName: \"kubernetes.io/projected/fc444db7-f445-40e7-bea0-f16f6afc2b91-kube-api-access-xdnbm\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:56 crc kubenswrapper[4606]: I1212 00:45:56.990276 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc444db7-f445-40e7-bea0-f16f6afc2b91-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.001181 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" event={"ID":"31335089-493b-42bb-9a5f-cb4ea39951f4","Type":"ContainerDied","Data":"fb90acd762d492feb0afe414e82b66b34cee5ac9bd7f1633a2105839c7ab13cf"} Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.001394 4606 scope.go:117] "RemoveContainer" 
containerID="8491d7f65e8fd7e0f7e430fda22ef9861a076be3942ad2ab7e3118166e6f9466" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.001665 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.012120 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31335089-493b-42bb-9a5f-cb4ea39951f4" (UID: "31335089-493b-42bb-9a5f-cb4ea39951f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.026697 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hp9lg" event={"ID":"ce0c3f48-d61e-420d-ab53-61361c7a4a25","Type":"ContainerDied","Data":"bbe469aed3c875a1e09a439e9614fecee00849e88743f45216af1c618533350c"} Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.026768 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbe469aed3c875a1e09a439e9614fecee00849e88743f45216af1c618533350c" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.026915 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hp9lg" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.043878 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-config" (OuterVolumeSpecName: "config") pod "ce0c3f48-d61e-420d-ab53-61361c7a4a25" (UID: "ce0c3f48-d61e-420d-ab53-61361c7a4a25"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.055464 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f59cc7d7-l4jln" event={"ID":"b710b7ca-abc9-465b-a279-949bd345962b","Type":"ContainerDied","Data":"3c630bf6508fa4bdcdc67478b61b4cccfe5ea168aa23c02bf5183d1130cc927e"} Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.055832 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f59cc7d7-l4jln" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.072492 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64d56f87f-bhmtm" event={"ID":"fc444db7-f445-40e7-bea0-f16f6afc2b91","Type":"ContainerDied","Data":"cc5eff8ae5bf9266942669732f2199a14fd8f8eb75ef357936b53da3fc821d49"} Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.072646 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64d56f87f-bhmtm" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.092700 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0c3f48-d61e-420d-ab53-61361c7a4a25-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.092752 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.109517 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "31335089-493b-42bb-9a5f-cb4ea39951f4" (UID: "31335089-493b-42bb-9a5f-cb4ea39951f4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.127635 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31335089-493b-42bb-9a5f-cb4ea39951f4" (UID: "31335089-493b-42bb-9a5f-cb4ea39951f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.130838 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31335089-493b-42bb-9a5f-cb4ea39951f4" (UID: "31335089-493b-42bb-9a5f-cb4ea39951f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.130924 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-config" (OuterVolumeSpecName: "config") pod "31335089-493b-42bb-9a5f-cb4ea39951f4" (UID: "31335089-493b-42bb-9a5f-cb4ea39951f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.215832 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.215889 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.215902 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.215913 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31335089-493b-42bb-9a5f-cb4ea39951f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.218046 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64d56f87f-bhmtm"] Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.245800 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64d56f87f-bhmtm"] Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.269986 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f59cc7d7-l4jln"] Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.290048 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f59cc7d7-l4jln"] Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.340878 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ms77"] Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.348067 4606 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-6ms77"] Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.734054 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" path="/var/lib/kubelet/pods/31335089-493b-42bb-9a5f-cb4ea39951f4/volumes" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.736483 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b710b7ca-abc9-465b-a279-949bd345962b" path="/var/lib/kubelet/pods/b710b7ca-abc9-465b-a279-949bd345962b/volumes" Dec 12 00:45:57 crc kubenswrapper[4606]: I1212 00:45:57.737001 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc444db7-f445-40e7-bea0-f16f6afc2b91" path="/var/lib/kubelet/pods/fc444db7-f445-40e7-bea0-f16f6afc2b91/volumes" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.165378 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qqpvb"] Dec 12 00:45:58 crc kubenswrapper[4606]: E1212 00:45:58.165707 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.165722 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" Dec 12 00:45:58 crc kubenswrapper[4606]: E1212 00:45:58.165732 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="init" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.165739 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="init" Dec 12 00:45:58 crc kubenswrapper[4606]: E1212 00:45:58.177816 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0c3f48-d61e-420d-ab53-61361c7a4a25" containerName="neutron-db-sync" Dec 12 00:45:58 crc 
kubenswrapper[4606]: I1212 00:45:58.177853 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0c3f48-d61e-420d-ab53-61361c7a4a25" containerName="neutron-db-sync" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.178116 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0c3f48-d61e-420d-ab53-61361c7a4a25" containerName="neutron-db-sync" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.178147 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.179025 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.193457 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qqpvb"] Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.326214 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5dfd45968b-nj9ll"] Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.327587 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.334323 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346081 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346353 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346579 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-config\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346695 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346747 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" 
Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346771 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346809 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn9k2\" (UniqueName: \"kubernetes.io/projected/e071e571-9ded-4520-9275-221d832aa78d-kube-api-access-jn9k2\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.346910 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xhqlh" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.347088 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.348344 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dfd45968b-nj9ll"] Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.451516 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.452018 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.452148 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-config\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.452312 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn9k2\" (UniqueName: \"kubernetes.io/projected/e071e571-9ded-4520-9275-221d832aa78d-kube-api-access-jn9k2\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.452781 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9dc\" (UniqueName: \"kubernetes.io/projected/218c1acf-b25f-43b6-9967-badd62c1a155-kube-api-access-8s9dc\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.453199 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.452675 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.453548 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.453684 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.453918 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-ovndb-tls-certs\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.454127 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-httpd-config\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.454258 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-combined-ca-bundle\") pod \"neutron-5dfd45968b-nj9ll\" 
(UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.454374 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-config\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.454415 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.454428 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.455250 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-config\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.488372 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn9k2\" (UniqueName: \"kubernetes.io/projected/e071e571-9ded-4520-9275-221d832aa78d-kube-api-access-jn9k2\") pod \"dnsmasq-dns-6b7b667979-qqpvb\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc 
kubenswrapper[4606]: I1212 00:45:58.519892 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.555669 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-httpd-config\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.556078 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-combined-ca-bundle\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.556144 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-config\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.556199 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9dc\" (UniqueName: \"kubernetes.io/projected/218c1acf-b25f-43b6-9967-badd62c1a155-kube-api-access-8s9dc\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.556253 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-ovndb-tls-certs\") pod \"neutron-5dfd45968b-nj9ll\" (UID: 
\"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.562310 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-httpd-config\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.567262 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-config\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.574699 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-combined-ca-bundle\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.575454 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-ovndb-tls-certs\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 00:45:58.583706 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s9dc\" (UniqueName: \"kubernetes.io/projected/218c1acf-b25f-43b6-9967-badd62c1a155-kube-api-access-8s9dc\") pod \"neutron-5dfd45968b-nj9ll\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:58 crc kubenswrapper[4606]: I1212 
00:45:58.680870 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:45:59 crc kubenswrapper[4606]: E1212 00:45:59.207022 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 12 00:45:59 crc kubenswrapper[4606]: E1212 00:45:59.207193 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,Mou
ntPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwfdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9xbj8_openstack(7978c0cd-b859-49f1-ad0e-1cb88ff58495): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:45:59 crc kubenswrapper[4606]: E1212 00:45:59.208523 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9xbj8" podUID="7978c0cd-b859-49f1-ad0e-1cb88ff58495" Dec 12 00:45:59 crc kubenswrapper[4606]: I1212 00:45:59.317259 4606 scope.go:117] "RemoveContainer" containerID="00daec83dafa2cf69b8acb19b94061424145cdfeacf788446937d11ece95dcaa" Dec 12 00:45:59 crc kubenswrapper[4606]: I1212 00:45:59.849367 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79c99578bb-cdgsn"] Dec 12 00:45:59 crc kubenswrapper[4606]: W1212 00:45:59.931754 4606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ede4720_3fd7_4524_adfc_c1c395f12170.slice/crio-24356c8576770364961e0198ef8f3ba94c43809cc5158bcadf3aad4ce0c21a66 WatchSource:0}: Error finding container 24356c8576770364961e0198ef8f3ba94c43809cc5158bcadf3aad4ce0c21a66: Status 404 returned error can't find the container with id 24356c8576770364961e0198ef8f3ba94c43809cc5158bcadf3aad4ce0c21a66 Dec 12 00:46:00 crc kubenswrapper[4606]: I1212 00:46:00.028030 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b9fb498f6-62fcc"] Dec 12 00:46:00 crc kubenswrapper[4606]: I1212 00:46:00.124315 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerStarted","Data":"24356c8576770364961e0198ef8f3ba94c43809cc5158bcadf3aad4ce0c21a66"} Dec 12 00:46:00 crc kubenswrapper[4606]: E1212 00:46:00.181331 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9xbj8" podUID="7978c0cd-b859-49f1-ad0e-1cb88ff58495" Dec 12 00:46:00 crc kubenswrapper[4606]: I1212 00:46:00.355934 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kcwn5"] Dec 12 00:46:00 crc kubenswrapper[4606]: I1212 00:46:00.568340 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dfd45968b-nj9ll"] Dec 12 00:46:00 crc kubenswrapper[4606]: I1212 00:46:00.608840 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qqpvb"] Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.045760 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d9886cd8c-2vtxs"] Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.047560 4606 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.053626 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.054324 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.066754 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d9886cd8c-2vtxs"] Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.113406 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-internal-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.113444 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn4sq\" (UniqueName: \"kubernetes.io/projected/c923042a-1c66-4db8-8e92-fc41e2f19b4f-kube-api-access-dn4sq\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.113506 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-ovndb-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.113542 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-combined-ca-bundle\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.113751 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-public-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.113866 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-httpd-config\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.113958 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-config\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.219509 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-ovndb-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.219568 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-combined-ca-bundle\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.219623 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-public-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.219658 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-httpd-config\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.219697 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-config\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.219725 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-internal-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.219742 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn4sq\" (UniqueName: \"kubernetes.io/projected/c923042a-1c66-4db8-8e92-fc41e2f19b4f-kube-api-access-dn4sq\") pod 
\"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.222797 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9fb498f6-62fcc" event={"ID":"e38df57e-1a86-4c45-bf40-6282a6a049ed","Type":"ContainerStarted","Data":"e44151b4e4ad27bf5cbb8a04ea7f6755b1796ce480140634ff655e07ae9a6a78"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.222844 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9fb498f6-62fcc" event={"ID":"e38df57e-1a86-4c45-bf40-6282a6a049ed","Type":"ContainerStarted","Data":"f30df7e0e3cc85475c05cbc368551d48c736fc63ddc3e024b5f45520fddcf3d7"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.229496 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-httpd-config\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.233295 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-combined-ca-bundle\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.235852 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-public-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.237323 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-config\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.243655 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-ovndb-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.247220 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b55cc6bf7-ckhd2" event={"ID":"465e3cb1-d565-45fb-9251-de59579f3add","Type":"ContainerStarted","Data":"2f51d6b05b49c252f430521dbf06561a8e8107c42c290eb1b48f934e24e71d34"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.247319 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b55cc6bf7-ckhd2" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon-log" containerID="cri-o://2f51d6b05b49c252f430521dbf06561a8e8107c42c290eb1b48f934e24e71d34" gracePeriod=30 Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.247679 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b55cc6bf7-ckhd2" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon" containerID="cri-o://cf0be897f83d1499d58628455a2bba9f36282137aa420d1d9db956ae32c07e35" gracePeriod=30 Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.251886 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcwn5" event={"ID":"8ff95d54-4b78-48cb-b8c9-33801a6818f0","Type":"ContainerStarted","Data":"6a8d99cbd3f26c6e5854367af42e1ce9ba21770fc02cad0e163f6b6760344294"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.255626 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerStarted","Data":"4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.267000 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c","Type":"ContainerStarted","Data":"bd67f99ed213e4fb654a58b7ee58e729045371a2c40ea81705d0056c76d6c05a"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.267641 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c923042a-1c66-4db8-8e92-fc41e2f19b4f-internal-tls-certs\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.273470 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerStarted","Data":"3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.276439 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" event={"ID":"e071e571-9ded-4520-9275-221d832aa78d","Type":"ContainerStarted","Data":"619cec974fb9ba7102c351dfddea3a15ed285c8a8993ed86e929e8d9b1794a6a"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.276921 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b55cc6bf7-ckhd2" podStartSLOduration=3.359733943 podStartE2EDuration="35.276889481s" podCreationTimestamp="2025-12-12 00:45:26 +0000 UTC" firstStartedPulling="2025-12-12 00:45:27.233971924 +0000 UTC m=+1317.779324790" lastFinishedPulling="2025-12-12 00:45:59.151127462 +0000 UTC m=+1349.696480328" 
observedRunningTime="2025-12-12 00:46:01.263251015 +0000 UTC m=+1351.808603881" watchObservedRunningTime="2025-12-12 00:46:01.276889481 +0000 UTC m=+1351.822242347" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.277829 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn4sq\" (UniqueName: \"kubernetes.io/projected/c923042a-1c66-4db8-8e92-fc41e2f19b4f-kube-api-access-dn4sq\") pod \"neutron-d9886cd8c-2vtxs\" (UID: \"c923042a-1c66-4db8-8e92-fc41e2f19b4f\") " pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.280069 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qmcpn" event={"ID":"73c136c9-e12d-434a-aab1-ed21dfaf0f60","Type":"ContainerStarted","Data":"0e8eece22d2df1260b252eb6bff1082a6dcbf84ce11e6b0f6c0ac4cd3cd3b87a"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.287347 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd45968b-nj9ll" event={"ID":"218c1acf-b25f-43b6-9967-badd62c1a155","Type":"ContainerStarted","Data":"88f5868b474b733fb729888cb06fc225d7f583d8440c6ee4c40f655edef62b54"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.301086 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-qmcpn" podStartSLOduration=5.033496543 podStartE2EDuration="39.30107078s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="2025-12-12 00:45:24.894049517 +0000 UTC m=+1315.439402383" lastFinishedPulling="2025-12-12 00:45:59.161623754 +0000 UTC m=+1349.706976620" observedRunningTime="2025-12-12 00:46:01.29625401 +0000 UTC m=+1351.841606876" watchObservedRunningTime="2025-12-12 00:46:01.30107078 +0000 UTC m=+1351.846423646" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.304400 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8e040a81-ee68-4745-b947-61aa28e33fa7","Type":"ContainerStarted","Data":"3968a1227bb5407a5351432bbc1d248eae38c49b7714f4a900ded9dfe698b58d"} Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.400031 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:01 crc kubenswrapper[4606]: I1212 00:46:01.464292 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-6ms77" podUID="31335089-493b-42bb-9a5f-cb4ea39951f4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.325897 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd45968b-nj9ll" event={"ID":"218c1acf-b25f-43b6-9967-badd62c1a155","Type":"ContainerStarted","Data":"d1b6480067229a434be4e619b942a9f48ca05a819714ceeaa05d7ca9e025c51b"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.326452 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd45968b-nj9ll" event={"ID":"218c1acf-b25f-43b6-9967-badd62c1a155","Type":"ContainerStarted","Data":"44e3ffe4c5022a3dae1e2fd671e95fc1ee641f7665dabc91a545e50f3b8f0634"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.327131 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.339440 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c","Type":"ContainerStarted","Data":"a1b89c85ce5fe7e0c71860a351d43de3d2f7d620050a22876d62d58a5085c21b"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.339649 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" 
containerName="glance-log" containerID="cri-o://bd67f99ed213e4fb654a58b7ee58e729045371a2c40ea81705d0056c76d6c05a" gracePeriod=30 Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.339762 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerName="glance-httpd" containerID="cri-o://a1b89c85ce5fe7e0c71860a351d43de3d2f7d620050a22876d62d58a5085c21b" gracePeriod=30 Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.343127 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerStarted","Data":"a50c810b61ab80031ddc96b3bb79c28f1af6ee88b45271a71187d7164a11dd04"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.346247 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4hg5f" event={"ID":"5a0b6b98-c743-4435-a967-55c0edb95531","Type":"ContainerStarted","Data":"6123990c2827dad2f196359e8f22bb8b9f9b9d940730041d059d21c3bc9fd5e1"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.354014 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9fb498f6-62fcc" event={"ID":"e38df57e-1a86-4c45-bf40-6282a6a049ed","Type":"ContainerStarted","Data":"2dbf7369ad77ec21071d76169ebe84202398f88e4b0d626bed0634bc2c0923cd"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.359893 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5dfd45968b-nj9ll" podStartSLOduration=4.359875044 podStartE2EDuration="4.359875044s" podCreationTimestamp="2025-12-12 00:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:02.356345169 +0000 UTC m=+1352.901698035" watchObservedRunningTime="2025-12-12 00:46:02.359875044 +0000 UTC m=+1352.905227910" Dec 12 
00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.377544 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" event={"ID":"e071e571-9ded-4520-9275-221d832aa78d","Type":"ContainerStarted","Data":"08cd718141c9d291cf03d46cabc4d360935067d8b604b7d59d1bfb86de4fe6a4"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.386528 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b55cc6bf7-ckhd2" event={"ID":"465e3cb1-d565-45fb-9251-de59579f3add","Type":"ContainerStarted","Data":"cf0be897f83d1499d58628455a2bba9f36282137aa420d1d9db956ae32c07e35"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.388036 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d9886cd8c-2vtxs"] Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.389768 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b9fb498f6-62fcc" podStartSLOduration=31.389749656 podStartE2EDuration="31.389749656s" podCreationTimestamp="2025-12-12 00:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:02.386864938 +0000 UTC m=+1352.932217804" watchObservedRunningTime="2025-12-12 00:46:02.389749656 +0000 UTC m=+1352.935102522" Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.394632 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcwn5" event={"ID":"8ff95d54-4b78-48cb-b8c9-33801a6818f0","Type":"ContainerStarted","Data":"f7c412f15338c35c0d22eec74a9158bf49f2120376a8b6e416c4b366c83906f5"} Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.424123 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79c99578bb-cdgsn" podStartSLOduration=31.424099268 podStartE2EDuration="31.424099268s" podCreationTimestamp="2025-12-12 00:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:02.411596362 +0000 UTC m=+1352.956949238" watchObservedRunningTime="2025-12-12 00:46:02.424099268 +0000 UTC m=+1352.969452134" Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.449036 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4hg5f" podStartSLOduration=4.604788397 podStartE2EDuration="40.449018646s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="2025-12-12 00:45:24.88670889 +0000 UTC m=+1315.432061766" lastFinishedPulling="2025-12-12 00:46:00.730939149 +0000 UTC m=+1351.276292015" observedRunningTime="2025-12-12 00:46:02.433355076 +0000 UTC m=+1352.978707942" watchObservedRunningTime="2025-12-12 00:46:02.449018646 +0000 UTC m=+1352.994371512" Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.473382 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=33.47336578 podStartE2EDuration="33.47336578s" podCreationTimestamp="2025-12-12 00:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:02.471632483 +0000 UTC m=+1353.016985349" watchObservedRunningTime="2025-12-12 00:46:02.47336578 +0000 UTC m=+1353.018718646" Dec 12 00:46:02 crc kubenswrapper[4606]: I1212 00:46:02.499029 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kcwn5" podStartSLOduration=16.499010168 podStartE2EDuration="16.499010168s" podCreationTimestamp="2025-12-12 00:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:02.496243894 +0000 UTC m=+1353.041596760" watchObservedRunningTime="2025-12-12 00:46:02.499010168 +0000 UTC m=+1353.044363054" Dec 12 
00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.411891 4606 generic.go:334] "Generic (PLEG): container finished" podID="e071e571-9ded-4520-9275-221d832aa78d" containerID="08cd718141c9d291cf03d46cabc4d360935067d8b604b7d59d1bfb86de4fe6a4" exitCode=0 Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.411979 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" event={"ID":"e071e571-9ded-4520-9275-221d832aa78d","Type":"ContainerDied","Data":"08cd718141c9d291cf03d46cabc4d360935067d8b604b7d59d1bfb86de4fe6a4"} Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.417468 4606 generic.go:334] "Generic (PLEG): container finished" podID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerID="a1b89c85ce5fe7e0c71860a351d43de3d2f7d620050a22876d62d58a5085c21b" exitCode=143 Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.417728 4606 generic.go:334] "Generic (PLEG): container finished" podID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerID="bd67f99ed213e4fb654a58b7ee58e729045371a2c40ea81705d0056c76d6c05a" exitCode=143 Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.417770 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c","Type":"ContainerDied","Data":"a1b89c85ce5fe7e0c71860a351d43de3d2f7d620050a22876d62d58a5085c21b"} Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.417795 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c","Type":"ContainerDied","Data":"bd67f99ed213e4fb654a58b7ee58e729045371a2c40ea81705d0056c76d6c05a"} Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.419591 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9886cd8c-2vtxs" 
event={"ID":"c923042a-1c66-4db8-8e92-fc41e2f19b4f","Type":"ContainerStarted","Data":"1e37716ca59b4b60d768578c0b426d2e7400fd3687852e594f298567ac02c229"} Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.419614 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9886cd8c-2vtxs" event={"ID":"c923042a-1c66-4db8-8e92-fc41e2f19b4f","Type":"ContainerStarted","Data":"37698b7e04f19bcbedf89297e3db6056e6259ffb9c6f9d6bd4060c910082e2fe"} Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.422606 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-log" containerID="cri-o://3968a1227bb5407a5351432bbc1d248eae38c49b7714f4a900ded9dfe698b58d" gracePeriod=30 Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.422764 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e040a81-ee68-4745-b947-61aa28e33fa7","Type":"ContainerStarted","Data":"2ddcad133862b367988cad00acdc0eda7ee0c4e09611df704d366ca56a1d4ce4"} Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.422919 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-httpd" containerID="cri-o://2ddcad133862b367988cad00acdc0eda7ee0c4e09611df704d366ca56a1d4ce4" gracePeriod=30 Dec 12 00:46:03 crc kubenswrapper[4606]: I1212 00:46:03.463929 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=34.463907024 podStartE2EDuration="34.463907024s" podCreationTimestamp="2025-12-12 00:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:03.451664995 +0000 UTC m=+1353.997017861" 
watchObservedRunningTime="2025-12-12 00:46:03.463907024 +0000 UTC m=+1354.009259890" Dec 12 00:46:04 crc kubenswrapper[4606]: I1212 00:46:04.431580 4606 generic.go:334] "Generic (PLEG): container finished" podID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerID="2ddcad133862b367988cad00acdc0eda7ee0c4e09611df704d366ca56a1d4ce4" exitCode=143 Dec 12 00:46:04 crc kubenswrapper[4606]: I1212 00:46:04.431611 4606 generic.go:334] "Generic (PLEG): container finished" podID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerID="3968a1227bb5407a5351432bbc1d248eae38c49b7714f4a900ded9dfe698b58d" exitCode=143 Dec 12 00:46:04 crc kubenswrapper[4606]: I1212 00:46:04.431631 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e040a81-ee68-4745-b947-61aa28e33fa7","Type":"ContainerDied","Data":"2ddcad133862b367988cad00acdc0eda7ee0c4e09611df704d366ca56a1d4ce4"} Dec 12 00:46:04 crc kubenswrapper[4606]: I1212 00:46:04.431656 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e040a81-ee68-4745-b947-61aa28e33fa7","Type":"ContainerDied","Data":"3968a1227bb5407a5351432bbc1d248eae38c49b7714f4a900ded9dfe698b58d"} Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.738593 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.816946 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.817024 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-scripts\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.817079 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-logs\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.817217 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-config-data\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.817245 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-public-tls-certs\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.817277 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-httpd-run\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.817324 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-combined-ca-bundle\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.817401 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gztps\" (UniqueName: \"kubernetes.io/projected/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-kube-api-access-gztps\") pod \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\" (UID: \"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c\") " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.819650 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.819928 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-logs" (OuterVolumeSpecName: "logs") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.827205 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-scripts" (OuterVolumeSpecName: "scripts") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.830478 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-kube-api-access-gztps" (OuterVolumeSpecName: "kube-api-access-gztps") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "kube-api-access-gztps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.838896 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.921569 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gztps\" (UniqueName: \"kubernetes.io/projected/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-kube-api-access-gztps\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.927559 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.927574 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.927583 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.927592 4606 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.923459 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-config-data" (OuterVolumeSpecName: "config-data") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.960456 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 12 00:46:05 crc kubenswrapper[4606]: I1212 00:46:05.982364 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.028692 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.028722 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.028733 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.120286 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" (UID: "c2bccf3c-0850-4ad7-96c0-415f9c6bb56c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.131068 4606 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.206738 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.337811 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-scripts\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.338321 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7kns\" (UniqueName: \"kubernetes.io/projected/8e040a81-ee68-4745-b947-61aa28e33fa7-kube-api-access-b7kns\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.338446 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-logs\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.338598 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.338700 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-config-data\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.338846 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-httpd-run\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.339036 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-combined-ca-bundle\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.339478 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-internal-tls-certs\") pod \"8e040a81-ee68-4745-b947-61aa28e33fa7\" (UID: \"8e040a81-ee68-4745-b947-61aa28e33fa7\") " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.341730 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.342104 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-logs" (OuterVolumeSpecName: "logs") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.343365 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-scripts" (OuterVolumeSpecName: "scripts") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.345308 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e040a81-ee68-4745-b947-61aa28e33fa7-kube-api-access-b7kns" (OuterVolumeSpecName: "kube-api-access-b7kns") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "kube-api-access-b7kns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.345674 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.395230 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.396210 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.399784 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-config-data" (OuterVolumeSpecName: "config-data") pod "8e040a81-ee68-4745-b947-61aa28e33fa7" (UID: "8e040a81-ee68-4745-b947-61aa28e33fa7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441424 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441459 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441469 4606 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441478 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441489 4606 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441497 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e040a81-ee68-4745-b947-61aa28e33fa7-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441505 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7kns\" (UniqueName: \"kubernetes.io/projected/8e040a81-ee68-4745-b947-61aa28e33fa7-kube-api-access-b7kns\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.441513 4606 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e040a81-ee68-4745-b947-61aa28e33fa7-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.448060 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d9886cd8c-2vtxs" event={"ID":"c923042a-1c66-4db8-8e92-fc41e2f19b4f","Type":"ContainerStarted","Data":"b181cd95bd403dcc161ad6f2e8506d38c00fc95d61ed2c41359799d7f1d43cd9"} Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.449413 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.452236 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.460867 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e040a81-ee68-4745-b947-61aa28e33fa7","Type":"ContainerDied","Data":"41e51447aab445cb43fca395871267b62a042140de05de1b40d80a02158d0418"} Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.460912 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.461074 4606 scope.go:117] "RemoveContainer" containerID="2ddcad133862b367988cad00acdc0eda7ee0c4e09611df704d366ca56a1d4ce4" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.464426 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" event={"ID":"e071e571-9ded-4520-9275-221d832aa78d","Type":"ContainerStarted","Data":"6f7d5a31d87804a5a38e1d766b9397d4097fc85d7e30158912179eb190bdd558"} Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.464622 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.470644 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerStarted","Data":"37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f"} Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.476764 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d9886cd8c-2vtxs" podStartSLOduration=5.476743819 podStartE2EDuration="5.476743819s" podCreationTimestamp="2025-12-12 00:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:06.466929705 +0000 UTC m=+1357.012282581" watchObservedRunningTime="2025-12-12 00:46:06.476743819 +0000 UTC m=+1357.022096685" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.479418 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.481382 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c2bccf3c-0850-4ad7-96c0-415f9c6bb56c","Type":"ContainerDied","Data":"8dc466df9cab9566d386a159c24b03c3ee2c8f5e339aeace048ecc0c6ba0399c"} Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.481491 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.494408 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" podStartSLOduration=8.494390142 podStartE2EDuration="8.494390142s" podCreationTimestamp="2025-12-12 00:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:06.493113318 +0000 UTC m=+1357.038466194" watchObservedRunningTime="2025-12-12 00:46:06.494390142 +0000 UTC m=+1357.039743008" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.500518 4606 scope.go:117] "RemoveContainer" containerID="3968a1227bb5407a5351432bbc1d248eae38c49b7714f4a900ded9dfe698b58d" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.523061 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.531520 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.549018 4606 scope.go:117] "RemoveContainer" containerID="a1b89c85ce5fe7e0c71860a351d43de3d2f7d620050a22876d62d58a5085c21b" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.550213 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.564867 4606 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: E1212 00:46:06.565267 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-log" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565280 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-log" Dec 12 00:46:06 crc kubenswrapper[4606]: E1212 00:46:06.565308 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-httpd" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565315 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-httpd" Dec 12 00:46:06 crc kubenswrapper[4606]: E1212 00:46:06.565331 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerName="glance-log" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565338 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerName="glance-log" Dec 12 00:46:06 crc kubenswrapper[4606]: E1212 00:46:06.565351 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerName="glance-httpd" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565357 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerName="glance-httpd" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565506 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" containerName="glance-log" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565520 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" 
containerName="glance-httpd" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565531 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-log" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.565547 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" containerName="glance-httpd" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.566348 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.579856 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.588580 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.606208 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.614272 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.614474 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.614982 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-gs7ln" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.675282 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683426 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683498 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-logs\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683553 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683593 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683667 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6kqb\" (UniqueName: \"kubernetes.io/projected/e3dc8210-b17e-45f0-8501-2a545cd4d020-kube-api-access-l6kqb\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683733 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683759 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.683796 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.695416 4606 scope.go:117] "RemoveContainer" containerID="bd67f99ed213e4fb654a58b7ee58e729045371a2c40ea81705d0056c76d6c05a" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.723188 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.724827 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.736858 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.737142 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.752385 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.785318 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.785377 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.785439 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.785484 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.785513 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.785537 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.787247 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6kqb\" (UniqueName: \"kubernetes.io/projected/e3dc8210-b17e-45f0-8501-2a545cd4d020-kube-api-access-l6kqb\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.787348 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.787496 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") 
device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.796519 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-logs\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.797958 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.806941 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.813795 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.826831 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 
00:46:06.827733 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6kqb\" (UniqueName: \"kubernetes.io/projected/e3dc8210-b17e-45f0-8501-2a545cd4d020-kube-api-access-l6kqb\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.828740 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889102 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889150 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889183 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-scripts\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889203 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889237 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsqzl\" (UniqueName: \"kubernetes.io/projected/c4accb34-f155-4198-b222-0800ff8755c3-kube-api-access-bsqzl\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889272 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-config-data\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889286 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-logs\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.889318 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc 
kubenswrapper[4606]: I1212 00:46:06.923383 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992009 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992113 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992136 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992157 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-scripts\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992187 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992206 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsqzl\" (UniqueName: \"kubernetes.io/projected/c4accb34-f155-4198-b222-0800ff8755c3-kube-api-access-bsqzl\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992239 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-config-data\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992254 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-logs\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.992630 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-logs\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.993193 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 12 00:46:06 crc kubenswrapper[4606]: I1212 00:46:06.993438 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.000232 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-config-data\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.002721 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.003124 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-scripts\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.004100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.019250 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.019872 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsqzl\" (UniqueName: \"kubernetes.io/projected/c4accb34-f155-4198-b222-0800ff8755c3-kube-api-access-bsqzl\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.049307 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.055851 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.714762 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e040a81-ee68-4745-b947-61aa28e33fa7" path="/var/lib/kubelet/pods/8e040a81-ee68-4745-b947-61aa28e33fa7/volumes" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.716001 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2bccf3c-0850-4ad7-96c0-415f9c6bb56c" path="/var/lib/kubelet/pods/c2bccf3c-0850-4ad7-96c0-415f9c6bb56c/volumes" Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.781135 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:46:07 crc kubenswrapper[4606]: I1212 00:46:07.911741 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:46:07 crc kubenswrapper[4606]: W1212 00:46:07.971434 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4accb34_f155_4198_b222_0800ff8755c3.slice/crio-a2183fecdd829489fcbd25eb8052b9d29697ccc0f5f85b55a4f7794ca15bb399 WatchSource:0}: Error finding container a2183fecdd829489fcbd25eb8052b9d29697ccc0f5f85b55a4f7794ca15bb399: Status 404 returned error can't find the container with id a2183fecdd829489fcbd25eb8052b9d29697ccc0f5f85b55a4f7794ca15bb399 Dec 12 00:46:08 crc kubenswrapper[4606]: I1212 00:46:08.547186 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c4accb34-f155-4198-b222-0800ff8755c3","Type":"ContainerStarted","Data":"a2183fecdd829489fcbd25eb8052b9d29697ccc0f5f85b55a4f7794ca15bb399"} Dec 12 00:46:08 crc kubenswrapper[4606]: I1212 00:46:08.555296 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e3dc8210-b17e-45f0-8501-2a545cd4d020","Type":"ContainerStarted","Data":"7a0046cca686cedb5519fc8ec4d0df3d5a83972639ef799f01de8e8d00b4c7b9"} Dec 12 00:46:09 crc kubenswrapper[4606]: I1212 00:46:09.581514 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3dc8210-b17e-45f0-8501-2a545cd4d020","Type":"ContainerStarted","Data":"ea4874522323d37556a3ca228c6e8c9fd26d6a3c5e91668d929069262ffb5441"} Dec 12 00:46:09 crc kubenswrapper[4606]: I1212 00:46:09.581967 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3dc8210-b17e-45f0-8501-2a545cd4d020","Type":"ContainerStarted","Data":"6f8ed27d7178a7d8152db04f45d45142e85ee1cea9642100ea6666b984537706"} Dec 12 00:46:09 crc kubenswrapper[4606]: I1212 00:46:09.597901 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c4accb34-f155-4198-b222-0800ff8755c3","Type":"ContainerStarted","Data":"54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b"} Dec 12 00:46:09 crc kubenswrapper[4606]: I1212 00:46:09.608628 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.6086131630000002 podStartE2EDuration="3.608613163s" podCreationTimestamp="2025-12-12 00:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:09.60590188 +0000 UTC m=+1360.151254746" watchObservedRunningTime="2025-12-12 00:46:09.608613163 +0000 UTC m=+1360.153966019" Dec 12 00:46:10 crc kubenswrapper[4606]: I1212 00:46:10.611917 4606 generic.go:334] "Generic (PLEG): container finished" podID="5a0b6b98-c743-4435-a967-55c0edb95531" containerID="6123990c2827dad2f196359e8f22bb8b9f9b9d940730041d059d21c3bc9fd5e1" exitCode=0 Dec 12 00:46:10 crc kubenswrapper[4606]: I1212 00:46:10.612004 
4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4hg5f" event={"ID":"5a0b6b98-c743-4435-a967-55c0edb95531","Type":"ContainerDied","Data":"6123990c2827dad2f196359e8f22bb8b9f9b9d940730041d059d21c3bc9fd5e1"} Dec 12 00:46:10 crc kubenswrapper[4606]: I1212 00:46:10.615678 4606 generic.go:334] "Generic (PLEG): container finished" podID="73c136c9-e12d-434a-aab1-ed21dfaf0f60" containerID="0e8eece22d2df1260b252eb6bff1082a6dcbf84ce11e6b0f6c0ac4cd3cd3b87a" exitCode=0 Dec 12 00:46:10 crc kubenswrapper[4606]: I1212 00:46:10.615748 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qmcpn" event={"ID":"73c136c9-e12d-434a-aab1-ed21dfaf0f60","Type":"ContainerDied","Data":"0e8eece22d2df1260b252eb6bff1082a6dcbf84ce11e6b0f6c0ac4cd3cd3b87a"} Dec 12 00:46:10 crc kubenswrapper[4606]: I1212 00:46:10.618754 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c4accb34-f155-4198-b222-0800ff8755c3","Type":"ContainerStarted","Data":"12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb"} Dec 12 00:46:10 crc kubenswrapper[4606]: I1212 00:46:10.658929 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.658908429 podStartE2EDuration="4.658908429s" podCreationTimestamp="2025-12-12 00:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:10.649876567 +0000 UTC m=+1361.195229433" watchObservedRunningTime="2025-12-12 00:46:10.658908429 +0000 UTC m=+1361.204261295" Dec 12 00:46:11 crc kubenswrapper[4606]: I1212 00:46:11.637726 4606 generic.go:334] "Generic (PLEG): container finished" podID="8ff95d54-4b78-48cb-b8c9-33801a6818f0" containerID="f7c412f15338c35c0d22eec74a9158bf49f2120376a8b6e416c4b366c83906f5" exitCode=0 Dec 12 00:46:11 crc kubenswrapper[4606]: I1212 
00:46:11.637962 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcwn5" event={"ID":"8ff95d54-4b78-48cb-b8c9-33801a6818f0","Type":"ContainerDied","Data":"f7c412f15338c35c0d22eec74a9158bf49f2120376a8b6e416c4b366c83906f5"} Dec 12 00:46:11 crc kubenswrapper[4606]: I1212 00:46:11.905710 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:46:11 crc kubenswrapper[4606]: I1212 00:46:11.906087 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:46:11 crc kubenswrapper[4606]: I1212 00:46:11.919825 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 12 00:46:12 crc kubenswrapper[4606]: I1212 00:46:12.178404 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:46:12 crc kubenswrapper[4606]: I1212 00:46:12.178487 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:46:12 crc kubenswrapper[4606]: I1212 00:46:12.181390 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:46:13 crc kubenswrapper[4606]: I1212 00:46:13.521859 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:46:13 crc kubenswrapper[4606]: I1212 00:46:13.582941 4606 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-j2fv2"] Dec 12 00:46:13 crc kubenswrapper[4606]: I1212 00:46:13.583191 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" podUID="39112d38-7887-4e2b-b32f-d679ca162941" containerName="dnsmasq-dns" containerID="cri-o://36fca641890dbd9a9be70fa6d54d9b5aaea44c7d21f2e44a90cbd3b6504661b1" gracePeriod=10 Dec 12 00:46:14 crc kubenswrapper[4606]: I1212 00:46:14.676310 4606 generic.go:334] "Generic (PLEG): container finished" podID="39112d38-7887-4e2b-b32f-d679ca162941" containerID="36fca641890dbd9a9be70fa6d54d9b5aaea44c7d21f2e44a90cbd3b6504661b1" exitCode=0 Dec 12 00:46:14 crc kubenswrapper[4606]: I1212 00:46:14.676354 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" event={"ID":"39112d38-7887-4e2b-b32f-d679ca162941","Type":"ContainerDied","Data":"36fca641890dbd9a9be70fa6d54d9b5aaea44c7d21f2e44a90cbd3b6504661b1"} Dec 12 00:46:14 crc kubenswrapper[4606]: I1212 00:46:14.978138 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4hg5f" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.067394 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-config-data\") pod \"5a0b6b98-c743-4435-a967-55c0edb95531\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.067472 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-combined-ca-bundle\") pod \"5a0b6b98-c743-4435-a967-55c0edb95531\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.067504 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0b6b98-c743-4435-a967-55c0edb95531-logs\") pod \"5a0b6b98-c743-4435-a967-55c0edb95531\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.067573 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-scripts\") pod \"5a0b6b98-c743-4435-a967-55c0edb95531\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.067658 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/5a0b6b98-c743-4435-a967-55c0edb95531-kube-api-access-v2cll\") pod \"5a0b6b98-c743-4435-a967-55c0edb95531\" (UID: \"5a0b6b98-c743-4435-a967-55c0edb95531\") " Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.068860 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5a0b6b98-c743-4435-a967-55c0edb95531-logs" (OuterVolumeSpecName: "logs") pod "5a0b6b98-c743-4435-a967-55c0edb95531" (UID: "5a0b6b98-c743-4435-a967-55c0edb95531"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.074927 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-scripts" (OuterVolumeSpecName: "scripts") pod "5a0b6b98-c743-4435-a967-55c0edb95531" (UID: "5a0b6b98-c743-4435-a967-55c0edb95531"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.087109 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0b6b98-c743-4435-a967-55c0edb95531-kube-api-access-v2cll" (OuterVolumeSpecName: "kube-api-access-v2cll") pod "5a0b6b98-c743-4435-a967-55c0edb95531" (UID: "5a0b6b98-c743-4435-a967-55c0edb95531"). InnerVolumeSpecName "kube-api-access-v2cll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.114743 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-config-data" (OuterVolumeSpecName: "config-data") pod "5a0b6b98-c743-4435-a967-55c0edb95531" (UID: "5a0b6b98-c743-4435-a967-55c0edb95531"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.121427 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a0b6b98-c743-4435-a967-55c0edb95531" (UID: "5a0b6b98-c743-4435-a967-55c0edb95531"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.169304 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.169629 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.169851 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0b6b98-c743-4435-a967-55c0edb95531-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.169915 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6b98-c743-4435-a967-55c0edb95531-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.169970 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/5a0b6b98-c743-4435-a967-55c0edb95531-kube-api-access-v2cll\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.707658 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4hg5f" Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.711204 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4hg5f" event={"ID":"5a0b6b98-c743-4435-a967-55c0edb95531","Type":"ContainerDied","Data":"b722a345ecaf7177444d77904c2aa47c636550c83d26daaee6c1a11d15b11cb6"} Dec 12 00:46:15 crc kubenswrapper[4606]: I1212 00:46:15.711235 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b722a345ecaf7177444d77904c2aa47c636550c83d26daaee6c1a11d15b11cb6" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.163362 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9f4869f76-gqjw4"] Dec 12 00:46:16 crc kubenswrapper[4606]: E1212 00:46:16.195611 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0b6b98-c743-4435-a967-55c0edb95531" containerName="placement-db-sync" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.195649 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0b6b98-c743-4435-a967-55c0edb95531" containerName="placement-db-sync" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.195965 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0b6b98-c743-4435-a967-55c0edb95531" containerName="placement-db-sync" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.197077 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.198832 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f4869f76-gqjw4"] Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.206428 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.206628 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.206741 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lvj2f" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.206969 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.207079 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.308523 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-config-data\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.308568 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-internal-tls-certs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.308607 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-public-tls-certs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.308693 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f2lm\" (UniqueName: \"kubernetes.io/projected/1ad12b18-e66a-4871-9a92-39e75070b4fb-kube-api-access-4f2lm\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.308731 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-scripts\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.308773 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-combined-ca-bundle\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.308798 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad12b18-e66a-4871-9a92-39e75070b4fb-logs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.409861 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-combined-ca-bundle\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.409906 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad12b18-e66a-4871-9a92-39e75070b4fb-logs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.409969 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-config-data\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.409989 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-internal-tls-certs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.410016 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-public-tls-certs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.410086 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f2lm\" (UniqueName: 
\"kubernetes.io/projected/1ad12b18-e66a-4871-9a92-39e75070b4fb-kube-api-access-4f2lm\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.410114 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-scripts\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.410858 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad12b18-e66a-4871-9a92-39e75070b4fb-logs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.415666 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-scripts\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.415740 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-internal-tls-certs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.416104 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-combined-ca-bundle\") pod \"placement-9f4869f76-gqjw4\" (UID: 
\"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.418243 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-public-tls-certs\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.425984 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad12b18-e66a-4871-9a92-39e75070b4fb-config-data\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.433977 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f2lm\" (UniqueName: \"kubernetes.io/projected/1ad12b18-e66a-4871-9a92-39e75070b4fb-kube-api-access-4f2lm\") pod \"placement-9f4869f76-gqjw4\" (UID: \"1ad12b18-e66a-4871-9a92-39e75070b4fb\") " pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:16 crc kubenswrapper[4606]: I1212 00:46:16.645339 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.020297 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.020590 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.060719 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.060771 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.097873 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.149200 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.174967 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.184559 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.721947 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.722009 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.722508 4606 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:17 crc kubenswrapper[4606]: I1212 00:46:17.722679 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.548927 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" podUID="39112d38-7887-4e2b-b32f-d679ca162941" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.784836 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qmcpn" event={"ID":"73c136c9-e12d-434a-aab1-ed21dfaf0f60","Type":"ContainerDied","Data":"3e58127df5cf06183dade2f6cb5c9d87ad85ab2f0678f7317084d0d6caec9458"} Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.785076 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e58127df5cf06183dade2f6cb5c9d87ad85ab2f0678f7317084d0d6caec9458" Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.785771 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.791519 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.795209 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcwn5" event={"ID":"8ff95d54-4b78-48cb-b8c9-33801a6818f0","Type":"ContainerDied","Data":"6a8d99cbd3f26c6e5854367af42e1ce9ba21770fc02cad0e163f6b6760344294"} Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.795250 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a8d99cbd3f26c6e5854367af42e1ce9ba21770fc02cad0e163f6b6760344294" Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.928443 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.998908 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-credential-keys\") pod \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999030 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnl4g\" (UniqueName: \"kubernetes.io/projected/8ff95d54-4b78-48cb-b8c9-33801a6818f0-kube-api-access-gnl4g\") pod \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999077 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mm82\" (UniqueName: \"kubernetes.io/projected/73c136c9-e12d-434a-aab1-ed21dfaf0f60-kube-api-access-7mm82\") pod \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999097 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-combined-ca-bundle\") pod \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999133 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-db-sync-config-data\") pod \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\" (UID: \"73c136c9-e12d-434a-aab1-ed21dfaf0f60\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999147 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-scripts\") pod \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999198 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-fernet-keys\") pod \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999234 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-config-data\") pod \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " Dec 12 00:46:18 crc kubenswrapper[4606]: I1212 00:46:18.999290 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle\") pod \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " Dec 12 00:46:19 crc 
kubenswrapper[4606]: I1212 00:46:19.010471 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8ff95d54-4b78-48cb-b8c9-33801a6818f0" (UID: "8ff95d54-4b78-48cb-b8c9-33801a6818f0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.010528 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c136c9-e12d-434a-aab1-ed21dfaf0f60-kube-api-access-7mm82" (OuterVolumeSpecName: "kube-api-access-7mm82") pod "73c136c9-e12d-434a-aab1-ed21dfaf0f60" (UID: "73c136c9-e12d-434a-aab1-ed21dfaf0f60"). InnerVolumeSpecName "kube-api-access-7mm82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.012307 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff95d54-4b78-48cb-b8c9-33801a6818f0-kube-api-access-gnl4g" (OuterVolumeSpecName: "kube-api-access-gnl4g") pod "8ff95d54-4b78-48cb-b8c9-33801a6818f0" (UID: "8ff95d54-4b78-48cb-b8c9-33801a6818f0"). InnerVolumeSpecName "kube-api-access-gnl4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.015898 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "73c136c9-e12d-434a-aab1-ed21dfaf0f60" (UID: "73c136c9-e12d-434a-aab1-ed21dfaf0f60"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.026818 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8ff95d54-4b78-48cb-b8c9-33801a6818f0" (UID: "8ff95d54-4b78-48cb-b8c9-33801a6818f0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.031353 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-scripts" (OuterVolumeSpecName: "scripts") pod "8ff95d54-4b78-48cb-b8c9-33801a6818f0" (UID: "8ff95d54-4b78-48cb-b8c9-33801a6818f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.065582 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c136c9-e12d-434a-aab1-ed21dfaf0f60" (UID: "73c136c9-e12d-434a-aab1-ed21dfaf0f60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: E1212 00:46:19.067514 4606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle podName:8ff95d54-4b78-48cb-b8c9-33801a6818f0 nodeName:}" failed. No retries permitted until 2025-12-12 00:46:19.567487121 +0000 UTC m=+1370.112839987 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle") pod "8ff95d54-4b78-48cb-b8c9-33801a6818f0" (UID: "8ff95d54-4b78-48cb-b8c9-33801a6818f0") : error deleting /var/lib/kubelet/pods/8ff95d54-4b78-48cb-b8c9-33801a6818f0/volume-subpaths: remove /var/lib/kubelet/pods/8ff95d54-4b78-48cb-b8c9-33801a6818f0/volume-subpaths: no such file or directory Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.070228 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-config-data" (OuterVolumeSpecName: "config-data") pod "8ff95d54-4b78-48cb-b8c9-33801a6818f0" (UID: "8ff95d54-4b78-48cb-b8c9-33801a6818f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.101780 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-sb\") pod \"39112d38-7887-4e2b-b32f-d679ca162941\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.101869 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-config\") pod \"39112d38-7887-4e2b-b32f-d679ca162941\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.101927 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-swift-storage-0\") pod \"39112d38-7887-4e2b-b32f-d679ca162941\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.101976 
4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-nb\") pod \"39112d38-7887-4e2b-b32f-d679ca162941\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102094 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dz6w\" (UniqueName: \"kubernetes.io/projected/39112d38-7887-4e2b-b32f-d679ca162941-kube-api-access-9dz6w\") pod \"39112d38-7887-4e2b-b32f-d679ca162941\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102138 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-svc\") pod \"39112d38-7887-4e2b-b32f-d679ca162941\" (UID: \"39112d38-7887-4e2b-b32f-d679ca162941\") " Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102718 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnl4g\" (UniqueName: \"kubernetes.io/projected/8ff95d54-4b78-48cb-b8c9-33801a6818f0-kube-api-access-gnl4g\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102740 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mm82\" (UniqueName: \"kubernetes.io/projected/73c136c9-e12d-434a-aab1-ed21dfaf0f60-kube-api-access-7mm82\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102753 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102765 4606 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/73c136c9-e12d-434a-aab1-ed21dfaf0f60-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102778 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102790 4606 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102800 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.102815 4606 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.142562 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39112d38-7887-4e2b-b32f-d679ca162941-kube-api-access-9dz6w" (OuterVolumeSpecName: "kube-api-access-9dz6w") pod "39112d38-7887-4e2b-b32f-d679ca162941" (UID: "39112d38-7887-4e2b-b32f-d679ca162941"). InnerVolumeSpecName "kube-api-access-9dz6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.177802 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39112d38-7887-4e2b-b32f-d679ca162941" (UID: "39112d38-7887-4e2b-b32f-d679ca162941"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.194934 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39112d38-7887-4e2b-b32f-d679ca162941" (UID: "39112d38-7887-4e2b-b32f-d679ca162941"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.208021 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39112d38-7887-4e2b-b32f-d679ca162941" (UID: "39112d38-7887-4e2b-b32f-d679ca162941"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.213596 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39112d38-7887-4e2b-b32f-d679ca162941" (UID: "39112d38-7887-4e2b-b32f-d679ca162941"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.214754 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.214779 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.214789 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.214797 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dz6w\" (UniqueName: \"kubernetes.io/projected/39112d38-7887-4e2b-b32f-d679ca162941-kube-api-access-9dz6w\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.214915 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.233652 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f4869f76-gqjw4"] Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.270009 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-config" (OuterVolumeSpecName: "config") pod "39112d38-7887-4e2b-b32f-d679ca162941" (UID: "39112d38-7887-4e2b-b32f-d679ca162941"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.318533 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39112d38-7887-4e2b-b32f-d679ca162941-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.626778 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle\") pod \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\" (UID: \"8ff95d54-4b78-48cb-b8c9-33801a6818f0\") " Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.641715 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ff95d54-4b78-48cb-b8c9-33801a6818f0" (UID: "8ff95d54-4b78-48cb-b8c9-33801a6818f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.728671 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff95d54-4b78-48cb-b8c9-33801a6818f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.810928 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" event={"ID":"39112d38-7887-4e2b-b32f-d679ca162941","Type":"ContainerDied","Data":"c9003bd8eb2677005882dce3843f74353f40d5852fa80cd621a61efcc5166985"} Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.810987 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-j2fv2" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.810994 4606 scope.go:117] "RemoveContainer" containerID="36fca641890dbd9a9be70fa6d54d9b5aaea44c7d21f2e44a90cbd3b6504661b1" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.814248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f4869f76-gqjw4" event={"ID":"1ad12b18-e66a-4871-9a92-39e75070b4fb","Type":"ContainerStarted","Data":"66852fb04482f3130a9a3f1456d69eeed57aee9261f71d74aa725ef1345ab0bc"} Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.814469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f4869f76-gqjw4" event={"ID":"1ad12b18-e66a-4871-9a92-39e75070b4fb","Type":"ContainerStarted","Data":"9a880b1f9332e5e71f62d2fb341db53eb2f492c2e106385e137d8df2f9fdd5c9"} Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.822778 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qmcpn" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.823568 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerStarted","Data":"50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5"} Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.823691 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kcwn5" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.926773 4606 scope.go:117] "RemoveContainer" containerID="527eee0fb28cbe9a028b32af3bdd23b9100c7cb7378cbd7337cf61c4a6455a68" Dec 12 00:46:19 crc kubenswrapper[4606]: I1212 00:46:19.953047 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-j2fv2"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.074632 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-j2fv2"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097119 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76b48998b8-ff8r8"] Dec 12 00:46:20 crc kubenswrapper[4606]: E1212 00:46:20.097511 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39112d38-7887-4e2b-b32f-d679ca162941" containerName="init" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097523 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="39112d38-7887-4e2b-b32f-d679ca162941" containerName="init" Dec 12 00:46:20 crc kubenswrapper[4606]: E1212 00:46:20.097537 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39112d38-7887-4e2b-b32f-d679ca162941" containerName="dnsmasq-dns" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097543 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="39112d38-7887-4e2b-b32f-d679ca162941" containerName="dnsmasq-dns" Dec 12 00:46:20 crc kubenswrapper[4606]: E1212 00:46:20.097558 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff95d54-4b78-48cb-b8c9-33801a6818f0" containerName="keystone-bootstrap" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097575 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff95d54-4b78-48cb-b8c9-33801a6818f0" containerName="keystone-bootstrap" Dec 12 00:46:20 crc kubenswrapper[4606]: E1212 00:46:20.097595 4606 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="73c136c9-e12d-434a-aab1-ed21dfaf0f60" containerName="barbican-db-sync" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097600 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c136c9-e12d-434a-aab1-ed21dfaf0f60" containerName="barbican-db-sync" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097766 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c136c9-e12d-434a-aab1-ed21dfaf0f60" containerName="barbican-db-sync" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097784 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff95d54-4b78-48cb-b8c9-33801a6818f0" containerName="keystone-bootstrap" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.097800 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="39112d38-7887-4e2b-b32f-d679ca162941" containerName="dnsmasq-dns" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.098505 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.103452 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.103771 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.104010 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.104217 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.104426 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wkdqx" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.104619 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.110463 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76b48998b8-ff8r8"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141117 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-internal-tls-certs\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141162 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-combined-ca-bundle\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " 
pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141244 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-scripts\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141269 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-public-tls-certs\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141284 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-credential-keys\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141305 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-fernet-keys\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141359 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-config-data\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " 
pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.141376 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpps\" (UniqueName: \"kubernetes.io/projected/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-kube-api-access-zxpps\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246090 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-config-data\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246423 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpps\" (UniqueName: \"kubernetes.io/projected/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-kube-api-access-zxpps\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246464 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-internal-tls-certs\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246482 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-combined-ca-bundle\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " 
pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246565 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-scripts\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246585 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-public-tls-certs\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246602 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-credential-keys\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.246625 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-fernet-keys\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.253499 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-config-data\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.257711 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-scripts\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.257887 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-credential-keys\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.260874 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-public-tls-certs\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.261384 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-fernet-keys\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.262773 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-combined-ca-bundle\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.263396 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-internal-tls-certs\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.333414 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpps\" (UniqueName: \"kubernetes.io/projected/5b032f38-cd06-4fa3-9db7-6405dbaffaf4-kube-api-access-zxpps\") pod \"keystone-76b48998b8-ff8r8\" (UID: \"5b032f38-cd06-4fa3-9db7-6405dbaffaf4\") " pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.459302 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.463481 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5555587ffc-hjrpk"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.465348 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.491908 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-56c8787fc4-ckfzn"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.495247 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.517534 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.517717 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8t92h" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.525426 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565502 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56c8787fc4-ckfzn"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565599 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-config-data-custom\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565646 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-combined-ca-bundle\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565676 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-config-data\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " 
pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565704 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92eda0c3-4480-4e66-b349-144d9fb32bad-logs\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565745 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q8wf\" (UniqueName: \"kubernetes.io/projected/71f55b69-c2e2-49eb-b468-b1e940b54f1e-kube-api-access-9q8wf\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565770 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvmsr\" (UniqueName: \"kubernetes.io/projected/92eda0c3-4480-4e66-b349-144d9fb32bad-kube-api-access-bvmsr\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565792 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71f55b69-c2e2-49eb-b468-b1e940b54f1e-logs\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.565813 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-combined-ca-bundle\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.566035 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-config-data\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.566069 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-config-data-custom\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.587442 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.594905 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5555587ffc-hjrpk"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.671745 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q8wf\" (UniqueName: \"kubernetes.io/projected/71f55b69-c2e2-49eb-b468-b1e940b54f1e-kube-api-access-9q8wf\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672046 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bvmsr\" (UniqueName: \"kubernetes.io/projected/92eda0c3-4480-4e66-b349-144d9fb32bad-kube-api-access-bvmsr\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672075 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71f55b69-c2e2-49eb-b468-b1e940b54f1e-logs\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672101 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-combined-ca-bundle\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672143 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-config-data\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672252 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-config-data-custom\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc 
kubenswrapper[4606]: I1212 00:46:20.672297 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-config-data-custom\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672337 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-combined-ca-bundle\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672371 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-config-data\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.672411 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92eda0c3-4480-4e66-b349-144d9fb32bad-logs\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.673026 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92eda0c3-4480-4e66-b349-144d9fb32bad-logs\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc 
kubenswrapper[4606]: I1212 00:46:20.677165 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71f55b69-c2e2-49eb-b468-b1e940b54f1e-logs\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.683338 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-config-data-custom\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.691059 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-combined-ca-bundle\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.692077 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-combined-ca-bundle\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.696405 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-config-data\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: 
I1212 00:46:20.696828 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92eda0c3-4480-4e66-b349-144d9fb32bad-config-data-custom\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.701316 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71f55b69-c2e2-49eb-b468-b1e940b54f1e-config-data\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.799412 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvmsr\" (UniqueName: \"kubernetes.io/projected/92eda0c3-4480-4e66-b349-144d9fb32bad-kube-api-access-bvmsr\") pod \"barbican-keystone-listener-56c8787fc4-ckfzn\" (UID: \"92eda0c3-4480-4e66-b349-144d9fb32bad\") " pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.799870 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q8wf\" (UniqueName: \"kubernetes.io/projected/71f55b69-c2e2-49eb-b468-b1e940b54f1e-kube-api-access-9q8wf\") pod \"barbican-worker-5555587ffc-hjrpk\" (UID: \"71f55b69-c2e2-49eb-b468-b1e940b54f1e\") " pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.857229 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-gjqxm"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.858700 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.885306 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-gjqxm"] Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.895324 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9xbj8" event={"ID":"7978c0cd-b859-49f1-ad0e-1cb88ff58495","Type":"ContainerStarted","Data":"b515f4d8e08e5e2e4e36edb12127b4cd245223498e1e82b164d9c51b5ca6bd93"} Dec 12 00:46:20 crc kubenswrapper[4606]: I1212 00:46:20.977547 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5555587ffc-hjrpk" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:20.991931 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9xbj8" podStartSLOduration=4.832442443 podStartE2EDuration="58.99188916s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="2025-12-12 00:45:24.415280989 +0000 UTC m=+1314.960633855" lastFinishedPulling="2025-12-12 00:46:18.574727706 +0000 UTC m=+1369.120080572" observedRunningTime="2025-12-12 00:46:20.978480041 +0000 UTC m=+1371.523832907" watchObservedRunningTime="2025-12-12 00:46:20.99188916 +0000 UTC m=+1371.537242016" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:20.996255 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntxf2\" (UniqueName: \"kubernetes.io/projected/17508d9b-339d-4b3a-ab72-e234a8ec168a-kube-api-access-ntxf2\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:20.996322 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-config\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:20.996383 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:20.996464 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:20.997117 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:20.997228 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.055004 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6575dff54-tdrpc"] Dec 
12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.060360 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.074595 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.101195 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.101241 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.101295 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.101319 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntxf2\" (UniqueName: \"kubernetes.io/projected/17508d9b-339d-4b3a-ab72-e234a8ec168a-kube-api-access-ntxf2\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.101352 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-config\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.101392 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.102248 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.103741 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.104238 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.104785 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.107679 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6575dff54-tdrpc"] Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.110818 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.114053 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-config\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.167864 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntxf2\" (UniqueName: \"kubernetes.io/projected/17508d9b-339d-4b3a-ab72-e234a8ec168a-kube-api-access-ntxf2\") pod \"dnsmasq-dns-848cf88cfc-gjqxm\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.201290 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.202665 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data-custom\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.202748 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-logs\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.202824 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-combined-ca-bundle\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.202864 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.202942 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nmv\" (UniqueName: \"kubernetes.io/projected/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-kube-api-access-86nmv\") pod 
\"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.304246 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data-custom\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.304484 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-logs\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.304535 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-combined-ca-bundle\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.304557 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.304627 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86nmv\" (UniqueName: \"kubernetes.io/projected/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-kube-api-access-86nmv\") pod \"barbican-api-6575dff54-tdrpc\" (UID: 
\"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.309212 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-logs\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.316910 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-combined-ca-bundle\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.317972 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.322697 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data-custom\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.327733 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nmv\" (UniqueName: \"kubernetes.io/projected/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-kube-api-access-86nmv\") pod \"barbican-api-6575dff54-tdrpc\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 
00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.427900 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76b48998b8-ff8r8"] Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.435392 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.792864 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39112d38-7887-4e2b-b32f-d679ca162941" path="/var/lib/kubelet/pods/39112d38-7887-4e2b-b32f-d679ca162941/volumes" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.917159 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.951348 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76b48998b8-ff8r8" event={"ID":"5b032f38-cd06-4fa3-9db7-6405dbaffaf4","Type":"ContainerStarted","Data":"f589aeadea07ff158f8577a91bbcd0d8c0609e677578b8bee3edfcc759dce5b8"} Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.960929 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f4869f76-gqjw4" event={"ID":"1ad12b18-e66a-4871-9a92-39e75070b4fb","Type":"ContainerStarted","Data":"1b77c117686315ba0525fb9f38cfe04044d550d6fa7fabfa4aa3c587a2201960"} Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.961957 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 00:46:21.961977 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:21 crc kubenswrapper[4606]: I1212 
00:46:21.970311 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56c8787fc4-ckfzn"] Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.001972 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9f4869f76-gqjw4" podStartSLOduration=6.001948198 podStartE2EDuration="6.001948198s" podCreationTimestamp="2025-12-12 00:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:21.995980008 +0000 UTC m=+1372.541332874" watchObservedRunningTime="2025-12-12 00:46:22.001948198 +0000 UTC m=+1372.547301064" Dec 12 00:46:22 crc kubenswrapper[4606]: W1212 00:46:22.020516 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92eda0c3_4480_4e66_b349_144d9fb32bad.slice/crio-c178dc5f794e2e7561d4710c4a659e0307b69a14c3f99737d118b35a6ad3a7f1 WatchSource:0}: Error finding container c178dc5f794e2e7561d4710c4a659e0307b69a14c3f99737d118b35a6ad3a7f1: Status 404 returned error can't find the container with id c178dc5f794e2e7561d4710c4a659e0307b69a14c3f99737d118b35a6ad3a7f1 Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.199688 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.243226 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5555587ffc-hjrpk"] Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.349347 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-gjqxm"] Dec 12 00:46:22 crc 
kubenswrapper[4606]: W1212 00:46:22.367007 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17508d9b_339d_4b3a_ab72_e234a8ec168a.slice/crio-289ea5f374286f826b2c02768e282721733354cb1a0c0ffa427f39c7320130ad WatchSource:0}: Error finding container 289ea5f374286f826b2c02768e282721733354cb1a0c0ffa427f39c7320130ad: Status 404 returned error can't find the container with id 289ea5f374286f826b2c02768e282721733354cb1a0c0ffa427f39c7320130ad Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.578834 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6575dff54-tdrpc"] Dec 12 00:46:22 crc kubenswrapper[4606]: W1212 00:46:22.591467 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec2500f_eaf5_4ee7_9c07_66f7dda126f3.slice/crio-c31ae76f40f1a26090b22570f232c7a1bf92119acc58bdf11a07775c7d82f3c0 WatchSource:0}: Error finding container c31ae76f40f1a26090b22570f232c7a1bf92119acc58bdf11a07775c7d82f3c0: Status 404 returned error can't find the container with id c31ae76f40f1a26090b22570f232c7a1bf92119acc58bdf11a07775c7d82f3c0 Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.977188 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6575dff54-tdrpc" event={"ID":"dec2500f-eaf5-4ee7-9c07-66f7dda126f3","Type":"ContainerStarted","Data":"28897c4ec05b5aa59ad1ed5aa483cdb74e5499a6e8b201838073af70d30aefe4"} Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.977460 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6575dff54-tdrpc" event={"ID":"dec2500f-eaf5-4ee7-9c07-66f7dda126f3","Type":"ContainerStarted","Data":"c31ae76f40f1a26090b22570f232c7a1bf92119acc58bdf11a07775c7d82f3c0"} Dec 12 00:46:22 crc kubenswrapper[4606]: I1212 00:46:22.994834 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" event={"ID":"92eda0c3-4480-4e66-b349-144d9fb32bad","Type":"ContainerStarted","Data":"c178dc5f794e2e7561d4710c4a659e0307b69a14c3f99737d118b35a6ad3a7f1"} Dec 12 00:46:23 crc kubenswrapper[4606]: I1212 00:46:23.001912 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5555587ffc-hjrpk" event={"ID":"71f55b69-c2e2-49eb-b468-b1e940b54f1e","Type":"ContainerStarted","Data":"97d6b3c93416ca08a46f6c09825369d34fe94c4d4f7994de92c49c1cf037e26b"} Dec 12 00:46:23 crc kubenswrapper[4606]: I1212 00:46:23.012588 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76b48998b8-ff8r8" event={"ID":"5b032f38-cd06-4fa3-9db7-6405dbaffaf4","Type":"ContainerStarted","Data":"16254941bb806a0cb7dfce890a0e830df058f528a5c6a521070b90a6c3bebe5b"} Dec 12 00:46:23 crc kubenswrapper[4606]: I1212 00:46:23.013796 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:23 crc kubenswrapper[4606]: I1212 00:46:23.041192 4606 generic.go:334] "Generic (PLEG): container finished" podID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerID="f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0" exitCode=0 Dec 12 00:46:23 crc kubenswrapper[4606]: I1212 00:46:23.041382 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" event={"ID":"17508d9b-339d-4b3a-ab72-e234a8ec168a","Type":"ContainerDied","Data":"f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0"} Dec 12 00:46:23 crc kubenswrapper[4606]: I1212 00:46:23.041431 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" event={"ID":"17508d9b-339d-4b3a-ab72-e234a8ec168a","Type":"ContainerStarted","Data":"289ea5f374286f826b2c02768e282721733354cb1a0c0ffa427f39c7320130ad"} Dec 12 00:46:23 crc kubenswrapper[4606]: I1212 00:46:23.070800 4606 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76b48998b8-ff8r8" podStartSLOduration=3.07077697 podStartE2EDuration="3.07077697s" podCreationTimestamp="2025-12-12 00:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:23.035622818 +0000 UTC m=+1373.580975724" watchObservedRunningTime="2025-12-12 00:46:23.07077697 +0000 UTC m=+1373.616129826" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.082122 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" event={"ID":"17508d9b-339d-4b3a-ab72-e234a8ec168a","Type":"ContainerStarted","Data":"ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f"} Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.083059 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.099082 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6575dff54-tdrpc" event={"ID":"dec2500f-eaf5-4ee7-9c07-66f7dda126f3","Type":"ContainerStarted","Data":"d12094fab3625aae157adf8e8fe6fd662c5ca4433bcaea8eb4452d022b07b44f"} Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.099137 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.099595 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.117761 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" podStartSLOduration=4.117738346 podStartE2EDuration="4.117738346s" podCreationTimestamp="2025-12-12 00:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:24.108071277 +0000 UTC m=+1374.653424133" watchObservedRunningTime="2025-12-12 00:46:24.117738346 +0000 UTC m=+1374.663091212" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.535782 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6575dff54-tdrpc" podStartSLOduration=4.535764488 podStartE2EDuration="4.535764488s" podCreationTimestamp="2025-12-12 00:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:24.165606308 +0000 UTC m=+1374.710959174" watchObservedRunningTime="2025-12-12 00:46:24.535764488 +0000 UTC m=+1375.081117354" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.538091 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fd8976c8-ks4n2"] Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.541651 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.547756 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.548545 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.555252 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fd8976c8-ks4n2"] Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.649489 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-config-data\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.649551 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-logs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.649610 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-internal-tls-certs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.649661 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9xmd\" (UniqueName: 
\"kubernetes.io/projected/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-kube-api-access-x9xmd\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.649696 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-config-data-custom\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.649713 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-combined-ca-bundle\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.649736 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-public-tls-certs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.753949 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9xmd\" (UniqueName: \"kubernetes.io/projected/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-kube-api-access-x9xmd\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.754509 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-config-data-custom\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.756042 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-combined-ca-bundle\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.756088 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-public-tls-certs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.756158 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-config-data\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.756222 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-logs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.756294 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-internal-tls-certs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.758326 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-logs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.777441 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-internal-tls-certs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.778371 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-config-data-custom\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.779365 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9xmd\" (UniqueName: \"kubernetes.io/projected/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-kube-api-access-x9xmd\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.780946 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-config-data\") pod 
\"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.781910 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-public-tls-certs\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.797243 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82de1f4-ae7b-42bf-ae94-b39ba56b7e95-combined-ca-bundle\") pod \"barbican-api-5fd8976c8-ks4n2\" (UID: \"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95\") " pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:24 crc kubenswrapper[4606]: I1212 00:46:24.868030 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:25 crc kubenswrapper[4606]: I1212 00:46:25.360353 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 12 00:46:25 crc kubenswrapper[4606]: I1212 00:46:25.360786 4606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 00:46:25 crc kubenswrapper[4606]: I1212 00:46:25.364071 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 12 00:46:25 crc kubenswrapper[4606]: I1212 00:46:25.410566 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:25 crc kubenswrapper[4606]: I1212 00:46:25.415275 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 12 00:46:25 crc kubenswrapper[4606]: I1212 00:46:25.694077 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fd8976c8-ks4n2"] Dec 12 00:46:27 crc kubenswrapper[4606]: I1212 00:46:27.414818 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:28 crc kubenswrapper[4606]: I1212 00:46:28.656264 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f4869f76-gqjw4" Dec 12 00:46:28 crc kubenswrapper[4606]: I1212 00:46:28.713861 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:46:29 crc kubenswrapper[4606]: I1212 00:46:29.162662 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" event={"ID":"92eda0c3-4480-4e66-b349-144d9fb32bad","Type":"ContainerStarted","Data":"49f7c5a8ed5dab6cb87d8acfe62d7f8fc12005afb7c8185e0c4e2bfd985e5cae"} Dec 12 00:46:29 crc 
kubenswrapper[4606]: I1212 00:46:29.169919 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5555587ffc-hjrpk" event={"ID":"71f55b69-c2e2-49eb-b468-b1e940b54f1e","Type":"ContainerStarted","Data":"fbabe871b0892ed93c87a17879a74e06d6cc99452747041e44d63b40500302bb"} Dec 12 00:46:29 crc kubenswrapper[4606]: I1212 00:46:29.173358 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fd8976c8-ks4n2" event={"ID":"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95","Type":"ContainerStarted","Data":"0b954ae7c36242b0b9a0e056021994c9c53e657c1b7fa607f660e91e553fd440"} Dec 12 00:46:29 crc kubenswrapper[4606]: I1212 00:46:29.176327 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fd8976c8-ks4n2" event={"ID":"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95","Type":"ContainerStarted","Data":"875b02cd9d3611d79ea4919c2f631f6cede84f9d775b2eef78a0f37886705bbd"} Dec 12 00:46:30 crc kubenswrapper[4606]: I1212 00:46:30.184529 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fd8976c8-ks4n2" event={"ID":"a82de1f4-ae7b-42bf-ae94-b39ba56b7e95","Type":"ContainerStarted","Data":"fd3baf0396bfe0700a3ba48fa1f06e8d951e7c655ddb35d404b8c991f8e97ff7"} Dec 12 00:46:30 crc kubenswrapper[4606]: I1212 00:46:30.184842 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:30 crc kubenswrapper[4606]: I1212 00:46:30.200766 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" event={"ID":"92eda0c3-4480-4e66-b349-144d9fb32bad","Type":"ContainerStarted","Data":"917288b9c900150f7e276b712d3a62c150f45a3001daf15525649723aa8695e5"} Dec 12 00:46:30 crc kubenswrapper[4606]: I1212 00:46:30.218533 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5555587ffc-hjrpk" 
event={"ID":"71f55b69-c2e2-49eb-b468-b1e940b54f1e","Type":"ContainerStarted","Data":"39d1bb001709bcf7c087c3dcf43f1a0cff9216d3e21bb04c6d1e0668b0d84f8e"} Dec 12 00:46:30 crc kubenswrapper[4606]: I1212 00:46:30.239301 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fd8976c8-ks4n2" podStartSLOduration=6.23928448 podStartE2EDuration="6.23928448s" podCreationTimestamp="2025-12-12 00:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:30.218590836 +0000 UTC m=+1380.763943702" watchObservedRunningTime="2025-12-12 00:46:30.23928448 +0000 UTC m=+1380.784637346" Dec 12 00:46:30 crc kubenswrapper[4606]: I1212 00:46:30.243202 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5555587ffc-hjrpk" podStartSLOduration=3.9131264740000002 podStartE2EDuration="10.243177555s" podCreationTimestamp="2025-12-12 00:46:20 +0000 UTC" firstStartedPulling="2025-12-12 00:46:22.288423955 +0000 UTC m=+1372.833776821" lastFinishedPulling="2025-12-12 00:46:28.618475036 +0000 UTC m=+1379.163827902" observedRunningTime="2025-12-12 00:46:30.238966092 +0000 UTC m=+1380.784318988" watchObservedRunningTime="2025-12-12 00:46:30.243177555 +0000 UTC m=+1380.788530411" Dec 12 00:46:30 crc kubenswrapper[4606]: I1212 00:46:30.271159 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-56c8787fc4-ckfzn" podStartSLOduration=3.708714415 podStartE2EDuration="10.271134444s" podCreationTimestamp="2025-12-12 00:46:20 +0000 UTC" firstStartedPulling="2025-12-12 00:46:22.041122207 +0000 UTC m=+1372.586475073" lastFinishedPulling="2025-12-12 00:46:28.603542236 +0000 UTC m=+1379.148895102" observedRunningTime="2025-12-12 00:46:30.265568505 +0000 UTC m=+1380.810921381" watchObservedRunningTime="2025-12-12 00:46:30.271134444 +0000 UTC m=+1380.816487320" 
Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.204238 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.230772 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.275628 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qqpvb"] Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.276266 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="dnsmasq-dns" containerID="cri-o://6f7d5a31d87804a5a38e1d766b9397d4097fc85d7e30158912179eb190bdd558" gracePeriod=10 Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.417814 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d9886cd8c-2vtxs" Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.516763 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dfd45968b-nj9ll"] Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.516999 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dfd45968b-nj9ll" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-api" containerID="cri-o://44e3ffe4c5022a3dae1e2fd671e95fc1ee641f7665dabc91a545e50f3b8f0634" gracePeriod=30 Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.517139 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dfd45968b-nj9ll" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-httpd" containerID="cri-o://d1b6480067229a434be4e619b942a9f48ca05a819714ceeaa05d7ca9e025c51b" gracePeriod=30 Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.905833 4606 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.905916 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.906635 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"a50c810b61ab80031ddc96b3bb79c28f1af6ee88b45271a71187d7164a11dd04"} pod="openstack/horizon-79c99578bb-cdgsn" containerMessage="Container horizon failed startup probe, will be restarted" Dec 12 00:46:31 crc kubenswrapper[4606]: I1212 00:46:31.906673 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" containerID="cri-o://a50c810b61ab80031ddc96b3bb79c28f1af6ee88b45271a71187d7164a11dd04" gracePeriod=30 Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.178855 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.179131 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.179763 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" 
containerStatusID={"Type":"cri-o","ID":"2dbf7369ad77ec21071d76169ebe84202398f88e4b0d626bed0634bc2c0923cd"} pod="openstack/horizon-b9fb498f6-62fcc" containerMessage="Container horizon failed startup probe, will be restarted" Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.179786 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" containerID="cri-o://2dbf7369ad77ec21071d76169ebe84202398f88e4b0d626bed0634bc2c0923cd" gracePeriod=30 Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.258390 4606 generic.go:334] "Generic (PLEG): container finished" podID="465e3cb1-d565-45fb-9251-de59579f3add" containerID="cf0be897f83d1499d58628455a2bba9f36282137aa420d1d9db956ae32c07e35" exitCode=137 Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.258420 4606 generic.go:334] "Generic (PLEG): container finished" podID="465e3cb1-d565-45fb-9251-de59579f3add" containerID="2f51d6b05b49c252f430521dbf06561a8e8107c42c290eb1b48f934e24e71d34" exitCode=137 Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.258495 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b55cc6bf7-ckhd2" event={"ID":"465e3cb1-d565-45fb-9251-de59579f3add","Type":"ContainerDied","Data":"cf0be897f83d1499d58628455a2bba9f36282137aa420d1d9db956ae32c07e35"} Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.258521 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b55cc6bf7-ckhd2" event={"ID":"465e3cb1-d565-45fb-9251-de59579f3add","Type":"ContainerDied","Data":"2f51d6b05b49c252f430521dbf06561a8e8107c42c290eb1b48f934e24e71d34"} Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.266464 4606 generic.go:334] "Generic (PLEG): container finished" podID="218c1acf-b25f-43b6-9967-badd62c1a155" containerID="d1b6480067229a434be4e619b942a9f48ca05a819714ceeaa05d7ca9e025c51b" exitCode=0 Dec 12 00:46:32 crc 
kubenswrapper[4606]: I1212 00:46:32.266707 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd45968b-nj9ll" event={"ID":"218c1acf-b25f-43b6-9967-badd62c1a155","Type":"ContainerDied","Data":"d1b6480067229a434be4e619b942a9f48ca05a819714ceeaa05d7ca9e025c51b"} Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.268295 4606 generic.go:334] "Generic (PLEG): container finished" podID="7978c0cd-b859-49f1-ad0e-1cb88ff58495" containerID="b515f4d8e08e5e2e4e36edb12127b4cd245223498e1e82b164d9c51b5ca6bd93" exitCode=0 Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.268335 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9xbj8" event={"ID":"7978c0cd-b859-49f1-ad0e-1cb88ff58495","Type":"ContainerDied","Data":"b515f4d8e08e5e2e4e36edb12127b4cd245223498e1e82b164d9c51b5ca6bd93"} Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.274567 4606 generic.go:334] "Generic (PLEG): container finished" podID="e071e571-9ded-4520-9275-221d832aa78d" containerID="6f7d5a31d87804a5a38e1d766b9397d4097fc85d7e30158912179eb190bdd558" exitCode=0 Dec 12 00:46:32 crc kubenswrapper[4606]: I1212 00:46:32.275274 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" event={"ID":"e071e571-9ded-4520-9275-221d832aa78d","Type":"ContainerDied","Data":"6f7d5a31d87804a5a38e1d766b9397d4097fc85d7e30158912179eb190bdd558"} Dec 12 00:46:33 crc kubenswrapper[4606]: I1212 00:46:33.528342 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: connect: connection refused" Dec 12 00:46:33 crc kubenswrapper[4606]: I1212 00:46:33.804096 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:33 crc kubenswrapper[4606]: I1212 00:46:33.954977 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:36 crc kubenswrapper[4606]: I1212 00:46:36.839197 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:36 crc kubenswrapper[4606]: I1212 00:46:36.873019 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fd8976c8-ks4n2" Dec 12 00:46:36 crc kubenswrapper[4606]: I1212 00:46:36.929743 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6575dff54-tdrpc"] Dec 12 00:46:36 crc kubenswrapper[4606]: I1212 00:46:36.929951 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6575dff54-tdrpc" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api-log" containerID="cri-o://28897c4ec05b5aa59ad1ed5aa483cdb74e5499a6e8b201838073af70d30aefe4" gracePeriod=30 Dec 12 00:46:36 crc kubenswrapper[4606]: I1212 00:46:36.930074 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6575dff54-tdrpc" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api" containerID="cri-o://d12094fab3625aae157adf8e8fe6fd662c5ca4433bcaea8eb4452d022b07b44f" gracePeriod=30 Dec 12 00:46:37 crc kubenswrapper[4606]: I1212 00:46:37.353775 4606 generic.go:334] "Generic (PLEG): container finished" podID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerID="28897c4ec05b5aa59ad1ed5aa483cdb74e5499a6e8b201838073af70d30aefe4" exitCode=143 Dec 12 00:46:37 crc kubenswrapper[4606]: I1212 00:46:37.354354 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6575dff54-tdrpc" event={"ID":"dec2500f-eaf5-4ee7-9c07-66f7dda126f3","Type":"ContainerDied","Data":"28897c4ec05b5aa59ad1ed5aa483cdb74e5499a6e8b201838073af70d30aefe4"} Dec 12 00:46:38 crc kubenswrapper[4606]: I1212 00:46:38.905424 
4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:46:38 crc kubenswrapper[4606]: I1212 00:46:38.907230 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.042726 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-svc\") pod \"e071e571-9ded-4520-9275-221d832aa78d\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.042787 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfdl\" (UniqueName: \"kubernetes.io/projected/7978c0cd-b859-49f1-ad0e-1cb88ff58495-kube-api-access-jwfdl\") pod \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.042823 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-combined-ca-bundle\") pod \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.042847 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7978c0cd-b859-49f1-ad0e-1cb88ff58495-etc-machine-id\") pod \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.042862 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-nb\") pod 
\"e071e571-9ded-4520-9275-221d832aa78d\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.042885 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-sb\") pod \"e071e571-9ded-4520-9275-221d832aa78d\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.042988 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-scripts\") pod \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.043007 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn9k2\" (UniqueName: \"kubernetes.io/projected/e071e571-9ded-4520-9275-221d832aa78d-kube-api-access-jn9k2\") pod \"e071e571-9ded-4520-9275-221d832aa78d\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.043047 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-config-data\") pod \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.043081 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-db-sync-config-data\") pod \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\" (UID: \"7978c0cd-b859-49f1-ad0e-1cb88ff58495\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.043112 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-swift-storage-0\") pod \"e071e571-9ded-4520-9275-221d832aa78d\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.043156 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-config\") pod \"e071e571-9ded-4520-9275-221d832aa78d\" (UID: \"e071e571-9ded-4520-9275-221d832aa78d\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.050689 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7978c0cd-b859-49f1-ad0e-1cb88ff58495-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7978c0cd-b859-49f1-ad0e-1cb88ff58495" (UID: "7978c0cd-b859-49f1-ad0e-1cb88ff58495"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.075392 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7978c0cd-b859-49f1-ad0e-1cb88ff58495" (UID: "7978c0cd-b859-49f1-ad0e-1cb88ff58495"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.079521 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7978c0cd-b859-49f1-ad0e-1cb88ff58495-kube-api-access-jwfdl" (OuterVolumeSpecName: "kube-api-access-jwfdl") pod "7978c0cd-b859-49f1-ad0e-1cb88ff58495" (UID: "7978c0cd-b859-49f1-ad0e-1cb88ff58495"). InnerVolumeSpecName "kube-api-access-jwfdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.091110 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-scripts" (OuterVolumeSpecName: "scripts") pod "7978c0cd-b859-49f1-ad0e-1cb88ff58495" (UID: "7978c0cd-b859-49f1-ad0e-1cb88ff58495"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.111361 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e071e571-9ded-4520-9275-221d832aa78d-kube-api-access-jn9k2" (OuterVolumeSpecName: "kube-api-access-jn9k2") pod "e071e571-9ded-4520-9275-221d832aa78d" (UID: "e071e571-9ded-4520-9275-221d832aa78d"). InnerVolumeSpecName "kube-api-access-jn9k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.125454 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.142722 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-config" (OuterVolumeSpecName: "config") pod "e071e571-9ded-4520-9275-221d832aa78d" (UID: "e071e571-9ded-4520-9275-221d832aa78d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.145153 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.145269 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwfdl\" (UniqueName: \"kubernetes.io/projected/7978c0cd-b859-49f1-ad0e-1cb88ff58495-kube-api-access-jwfdl\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.145332 4606 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7978c0cd-b859-49f1-ad0e-1cb88ff58495-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.145394 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.145452 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn9k2\" (UniqueName: \"kubernetes.io/projected/e071e571-9ded-4520-9275-221d832aa78d-kube-api-access-jn9k2\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.145505 4606 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.163766 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e071e571-9ded-4520-9275-221d832aa78d" (UID: "e071e571-9ded-4520-9275-221d832aa78d"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.225855 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e071e571-9ded-4520-9275-221d832aa78d" (UID: "e071e571-9ded-4520-9275-221d832aa78d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.232604 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e071e571-9ded-4520-9275-221d832aa78d" (UID: "e071e571-9ded-4520-9275-221d832aa78d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.233475 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-config-data" (OuterVolumeSpecName: "config-data") pod "7978c0cd-b859-49f1-ad0e-1cb88ff58495" (UID: "7978c0cd-b859-49f1-ad0e-1cb88ff58495"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.243462 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7978c0cd-b859-49f1-ad0e-1cb88ff58495" (UID: "7978c0cd-b859-49f1-ad0e-1cb88ff58495"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.249952 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465e3cb1-d565-45fb-9251-de59579f3add-logs\") pod \"465e3cb1-d565-45fb-9251-de59579f3add\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250015 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-scripts\") pod \"465e3cb1-d565-45fb-9251-de59579f3add\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250038 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-config-data\") pod \"465e3cb1-d565-45fb-9251-de59579f3add\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250062 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjwp4\" (UniqueName: \"kubernetes.io/projected/465e3cb1-d565-45fb-9251-de59579f3add-kube-api-access-jjwp4\") pod \"465e3cb1-d565-45fb-9251-de59579f3add\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250205 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/465e3cb1-d565-45fb-9251-de59579f3add-horizon-secret-key\") pod \"465e3cb1-d565-45fb-9251-de59579f3add\" (UID: \"465e3cb1-d565-45fb-9251-de59579f3add\") " Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250607 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250620 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250631 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250641 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7978c0cd-b859-49f1-ad0e-1cb88ff58495-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.250650 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.251472 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465e3cb1-d565-45fb-9251-de59579f3add-logs" (OuterVolumeSpecName: "logs") pod "465e3cb1-d565-45fb-9251-de59579f3add" (UID: "465e3cb1-d565-45fb-9251-de59579f3add"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.297392 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465e3cb1-d565-45fb-9251-de59579f3add-kube-api-access-jjwp4" (OuterVolumeSpecName: "kube-api-access-jjwp4") pod "465e3cb1-d565-45fb-9251-de59579f3add" (UID: "465e3cb1-d565-45fb-9251-de59579f3add"). InnerVolumeSpecName "kube-api-access-jjwp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.298541 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465e3cb1-d565-45fb-9251-de59579f3add-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "465e3cb1-d565-45fb-9251-de59579f3add" (UID: "465e3cb1-d565-45fb-9251-de59579f3add"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.344446 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e071e571-9ded-4520-9275-221d832aa78d" (UID: "e071e571-9ded-4520-9275-221d832aa78d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.352382 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465e3cb1-d565-45fb-9251-de59579f3add-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.352418 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjwp4\" (UniqueName: \"kubernetes.io/projected/465e3cb1-d565-45fb-9251-de59579f3add-kube-api-access-jjwp4\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.352430 4606 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/465e3cb1-d565-45fb-9251-de59579f3add-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.352439 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e071e571-9ded-4520-9275-221d832aa78d-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.394859 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" event={"ID":"e071e571-9ded-4520-9275-221d832aa78d","Type":"ContainerDied","Data":"619cec974fb9ba7102c351dfddea3a15ed285c8a8993ed86e929e8d9b1794a6a"} Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.394908 4606 scope.go:117] "RemoveContainer" containerID="6f7d5a31d87804a5a38e1d766b9397d4097fc85d7e30158912179eb190bdd558" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.395026 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.404825 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-scripts" (OuterVolumeSpecName: "scripts") pod "465e3cb1-d565-45fb-9251-de59579f3add" (UID: "465e3cb1-d565-45fb-9251-de59579f3add"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.405599 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b55cc6bf7-ckhd2" event={"ID":"465e3cb1-d565-45fb-9251-de59579f3add","Type":"ContainerDied","Data":"9d92f8ce684fac8750989f792341f509ed07e1ddbe387fc745d30d311638ab5e"} Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.405749 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b55cc6bf7-ckhd2" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.405751 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-config-data" (OuterVolumeSpecName: "config-data") pod "465e3cb1-d565-45fb-9251-de59579f3add" (UID: "465e3cb1-d565-45fb-9251-de59579f3add"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.423842 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9xbj8" event={"ID":"7978c0cd-b859-49f1-ad0e-1cb88ff58495","Type":"ContainerDied","Data":"76e42ca04fa8a4754b1af4600675dfe684c3689b875e7071dbc32d210f1af767"} Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.423876 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76e42ca04fa8a4754b1af4600675dfe684c3689b875e7071dbc32d210f1af767" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.423961 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9xbj8" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.425689 4606 scope.go:117] "RemoveContainer" containerID="08cd718141c9d291cf03d46cabc4d360935067d8b604b7d59d1bfb86de4fe6a4" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.447811 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qqpvb"] Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.466925 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qqpvb"] Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.472285 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.472322 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/465e3cb1-d565-45fb-9251-de59579f3add-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.482241 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b55cc6bf7-ckhd2"] Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 
00:46:39.483050 4606 scope.go:117] "RemoveContainer" containerID="cf0be897f83d1499d58628455a2bba9f36282137aa420d1d9db956ae32c07e35" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.492389 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b55cc6bf7-ckhd2"] Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.674960 4606 scope.go:117] "RemoveContainer" containerID="2f51d6b05b49c252f430521dbf06561a8e8107c42c290eb1b48f934e24e71d34" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.711944 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465e3cb1-d565-45fb-9251-de59579f3add" path="/var/lib/kubelet/pods/465e3cb1-d565-45fb-9251-de59579f3add/volumes" Dec 12 00:46:39 crc kubenswrapper[4606]: I1212 00:46:39.712644 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e071e571-9ded-4520-9275-221d832aa78d" path="/var/lib/kubelet/pods/e071e571-9ded-4520-9275-221d832aa78d/volumes" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167066 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:40 crc kubenswrapper[4606]: E1212 00:46:40.167435 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon-log" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167449 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon-log" Dec 12 00:46:40 crc kubenswrapper[4606]: E1212 00:46:40.167471 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="dnsmasq-dns" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167477 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="dnsmasq-dns" Dec 12 00:46:40 crc kubenswrapper[4606]: E1212 00:46:40.167490 4606 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7978c0cd-b859-49f1-ad0e-1cb88ff58495" containerName="cinder-db-sync" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167496 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="7978c0cd-b859-49f1-ad0e-1cb88ff58495" containerName="cinder-db-sync" Dec 12 00:46:40 crc kubenswrapper[4606]: E1212 00:46:40.167507 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="init" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167513 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="init" Dec 12 00:46:40 crc kubenswrapper[4606]: E1212 00:46:40.167520 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167526 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167726 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon-log" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167737 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="dnsmasq-dns" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167747 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="465e3cb1-d565-45fb-9251-de59579f3add" containerName="horizon" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.167761 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="7978c0cd-b859-49f1-ad0e-1cb88ff58495" containerName="cinder-db-sync" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.168636 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.180912 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.180931 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.181038 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.180927 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m4kqj" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.207537 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-p8nxl"] Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.209004 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.231306 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.243597 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-p8nxl"] Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.285132 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.285430 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.285524 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dgq\" (UniqueName: \"kubernetes.io/projected/36e6ef5a-367e-412f-af42-3bba95417184-kube-api-access-c4dgq\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.285822 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-scripts\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 
00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.285944 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.286024 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-svc\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.286114 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4d2d\" (UniqueName: \"kubernetes.io/projected/06ce9244-6f03-4386-8b05-fdca2f2e1120-kube-api-access-s4d2d\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.286257 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.286338 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" 
Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.286416 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06ce9244-6f03-4386-8b05-fdca2f2e1120-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.286498 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-config\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.286607 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388285 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388375 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388423 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388461 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dgq\" (UniqueName: \"kubernetes.io/projected/36e6ef5a-367e-412f-af42-3bba95417184-kube-api-access-c4dgq\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388501 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-scripts\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388533 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388559 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-svc\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388593 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4d2d\" (UniqueName: 
\"kubernetes.io/projected/06ce9244-6f03-4386-8b05-fdca2f2e1120-kube-api-access-s4d2d\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388622 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388638 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388659 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06ce9244-6f03-4386-8b05-fdca2f2e1120-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.388679 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-config\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.391390 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-config\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: 
\"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.391521 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.391566 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.391772 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06ce9244-6f03-4386-8b05-fdca2f2e1120-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.392825 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.392887 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-svc\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc 
kubenswrapper[4606]: I1212 00:46:40.396152 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-scripts\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.397397 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.398417 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.412907 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.415400 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4d2d\" (UniqueName: \"kubernetes.io/projected/06ce9244-6f03-4386-8b05-fdca2f2e1120-kube-api-access-s4d2d\") pod \"cinder-scheduler-0\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.421587 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dgq\" (UniqueName: 
\"kubernetes.io/projected/36e6ef5a-367e-412f-af42-3bba95417184-kube-api-access-c4dgq\") pod \"dnsmasq-dns-6578955fd5-p8nxl\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.472598 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.475266 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-central-agent" containerID="cri-o://4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0" gracePeriod=30 Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.475846 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="proxy-httpd" containerID="cri-o://76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce" gracePeriod=30 Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.475905 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="sg-core" containerID="cri-o://50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5" gracePeriod=30 Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.475940 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-notification-agent" containerID="cri-o://37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f" gracePeriod=30 Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.482694 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerStarted","Data":"76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce"} Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.482736 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.482812 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.487090 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.506696 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.526930 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.537674 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.593049 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbd55103-6007-4c6a-b06e-add09fe3483a-logs\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.593086 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6xc\" (UniqueName: \"kubernetes.io/projected/fbd55103-6007-4c6a-b06e-add09fe3483a-kube-api-access-7n6xc\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.593125 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data-custom\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.593208 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-scripts\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.593266 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.593298 
4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.593316 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbd55103-6007-4c6a-b06e-add09fe3483a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.604911 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.8295311640000005 podStartE2EDuration="1m18.604896656s" podCreationTimestamp="2025-12-12 00:45:22 +0000 UTC" firstStartedPulling="2025-12-12 00:45:25.112981243 +0000 UTC m=+1315.658334119" lastFinishedPulling="2025-12-12 00:46:38.888346745 +0000 UTC m=+1389.433699611" observedRunningTime="2025-12-12 00:46:40.550667152 +0000 UTC m=+1391.096020028" watchObservedRunningTime="2025-12-12 00:46:40.604896656 +0000 UTC m=+1391.150249522" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695366 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbd55103-6007-4c6a-b06e-add09fe3483a-logs\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695415 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6xc\" (UniqueName: \"kubernetes.io/projected/fbd55103-6007-4c6a-b06e-add09fe3483a-kube-api-access-7n6xc\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " 
pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695442 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data-custom\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695507 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-scripts\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695548 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695579 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695598 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbd55103-6007-4c6a-b06e-add09fe3483a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.695677 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/fbd55103-6007-4c6a-b06e-add09fe3483a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.696024 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbd55103-6007-4c6a-b06e-add09fe3483a-logs\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.702485 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.705566 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data-custom\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.706019 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-scripts\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.714331 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.719286 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7n6xc\" (UniqueName: \"kubernetes.io/projected/fbd55103-6007-4c6a-b06e-add09fe3483a-kube-api-access-7n6xc\") pod \"cinder-api-0\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " pod="openstack/cinder-api-0" Dec 12 00:46:40 crc kubenswrapper[4606]: I1212 00:46:40.826927 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.501553 4606 generic.go:334] "Generic (PLEG): container finished" podID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerID="d12094fab3625aae157adf8e8fe6fd662c5ca4433bcaea8eb4452d022b07b44f" exitCode=0 Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.502615 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6575dff54-tdrpc" event={"ID":"dec2500f-eaf5-4ee7-9c07-66f7dda126f3","Type":"ContainerDied","Data":"d12094fab3625aae157adf8e8fe6fd662c5ca4433bcaea8eb4452d022b07b44f"} Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.521985 4606 generic.go:334] "Generic (PLEG): container finished" podID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerID="76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce" exitCode=0 Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.522017 4606 generic.go:334] "Generic (PLEG): container finished" podID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerID="50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5" exitCode=2 Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.522026 4606 generic.go:334] "Generic (PLEG): container finished" podID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerID="4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0" exitCode=0 Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.522046 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerDied","Data":"76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce"} Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.522072 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerDied","Data":"50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5"} Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.522082 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerDied","Data":"4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0"} Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.606151 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:41 crc kubenswrapper[4606]: W1212 00:46:41.606915 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ce9244_6f03_4386_8b05_fdca2f2e1120.slice/crio-37cb80a31bb6d32d18b30de4db071bc18f81850cf0c10c78ee00f99b06da5ec5 WatchSource:0}: Error finding container 37cb80a31bb6d32d18b30de4db071bc18f81850cf0c10c78ee00f99b06da5ec5: Status 404 returned error can't find the container with id 37cb80a31bb6d32d18b30de4db071bc18f81850cf0c10c78ee00f99b06da5ec5 Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.622406 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:41 crc kubenswrapper[4606]: W1212 00:46:41.634064 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36e6ef5a_367e_412f_af42_3bba95417184.slice/crio-343ca413d6d41b7feaf5639e68653591545070834a2d53185e957e8f56ef024e WatchSource:0}: Error finding container 343ca413d6d41b7feaf5639e68653591545070834a2d53185e957e8f56ef024e: Status 404 returned error can't find the container with id 343ca413d6d41b7feaf5639e68653591545070834a2d53185e957e8f56ef024e Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.662523 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-p8nxl"] Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.744815 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-combined-ca-bundle\") pod \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.744868 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data-custom\") pod \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.744887 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-logs\") pod \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.744962 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data\") pod \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.745126 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86nmv\" (UniqueName: \"kubernetes.io/projected/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-kube-api-access-86nmv\") pod \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\" (UID: \"dec2500f-eaf5-4ee7-9c07-66f7dda126f3\") " Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.746474 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-logs" (OuterVolumeSpecName: "logs") pod "dec2500f-eaf5-4ee7-9c07-66f7dda126f3" (UID: "dec2500f-eaf5-4ee7-9c07-66f7dda126f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.752944 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dec2500f-eaf5-4ee7-9c07-66f7dda126f3" (UID: "dec2500f-eaf5-4ee7-9c07-66f7dda126f3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.760492 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-kube-api-access-86nmv" (OuterVolumeSpecName: "kube-api-access-86nmv") pod "dec2500f-eaf5-4ee7-9c07-66f7dda126f3" (UID: "dec2500f-eaf5-4ee7-9c07-66f7dda126f3"). InnerVolumeSpecName "kube-api-access-86nmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.808256 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec2500f-eaf5-4ee7-9c07-66f7dda126f3" (UID: "dec2500f-eaf5-4ee7-9c07-66f7dda126f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.847416 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86nmv\" (UniqueName: \"kubernetes.io/projected/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-kube-api-access-86nmv\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.847439 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.847447 4606 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.847455 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:41 crc kubenswrapper[4606]: I1212 00:46:41.961544 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.023640 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data" (OuterVolumeSpecName: "config-data") pod 
"dec2500f-eaf5-4ee7-9c07-66f7dda126f3" (UID: "dec2500f-eaf5-4ee7-9c07-66f7dda126f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.050667 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec2500f-eaf5-4ee7-9c07-66f7dda126f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.534641 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"06ce9244-6f03-4386-8b05-fdca2f2e1120","Type":"ContainerStarted","Data":"37cb80a31bb6d32d18b30de4db071bc18f81850cf0c10c78ee00f99b06da5ec5"} Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.540431 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6575dff54-tdrpc" event={"ID":"dec2500f-eaf5-4ee7-9c07-66f7dda126f3","Type":"ContainerDied","Data":"c31ae76f40f1a26090b22570f232c7a1bf92119acc58bdf11a07775c7d82f3c0"} Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.540473 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6575dff54-tdrpc" Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.540486 4606 scope.go:117] "RemoveContainer" containerID="d12094fab3625aae157adf8e8fe6fd662c5ca4433bcaea8eb4452d022b07b44f" Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.556652 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fbd55103-6007-4c6a-b06e-add09fe3483a","Type":"ContainerStarted","Data":"01789232fd6a2fc91306ac8aa3932c797e58354bba354de23bc80747991d47b5"} Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.564865 4606 generic.go:334] "Generic (PLEG): container finished" podID="36e6ef5a-367e-412f-af42-3bba95417184" containerID="060726d7cba75c94080e422330e8a6b95588f87352842c10483c5166661c537c" exitCode=0 Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.564904 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" event={"ID":"36e6ef5a-367e-412f-af42-3bba95417184","Type":"ContainerDied","Data":"060726d7cba75c94080e422330e8a6b95588f87352842c10483c5166661c537c"} Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.564928 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" event={"ID":"36e6ef5a-367e-412f-af42-3bba95417184","Type":"ContainerStarted","Data":"343ca413d6d41b7feaf5639e68653591545070834a2d53185e957e8f56ef024e"} Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.583609 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6575dff54-tdrpc"] Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.584664 4606 scope.go:117] "RemoveContainer" containerID="28897c4ec05b5aa59ad1ed5aa483cdb74e5499a6e8b201838073af70d30aefe4" Dec 12 00:46:42 crc kubenswrapper[4606]: I1212 00:46:42.588902 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6575dff54-tdrpc"] Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 
00:46:43.521154 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-qqpvb" podUID="e071e571-9ded-4520-9275-221d832aa78d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: i/o timeout" Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 00:46:43.586009 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fbd55103-6007-4c6a-b06e-add09fe3483a","Type":"ContainerStarted","Data":"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559"} Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 00:46:43.595566 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" event={"ID":"36e6ef5a-367e-412f-af42-3bba95417184","Type":"ContainerStarted","Data":"cf236970ecdd890458909955d16a3c5ec90b299c8f6eee249228c1fd39f1aacf"} Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 00:46:43.595793 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 00:46:43.600268 4606 generic.go:334] "Generic (PLEG): container finished" podID="218c1acf-b25f-43b6-9967-badd62c1a155" containerID="44e3ffe4c5022a3dae1e2fd671e95fc1ee641f7665dabc91a545e50f3b8f0634" exitCode=0 Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 00:46:43.600314 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd45968b-nj9ll" event={"ID":"218c1acf-b25f-43b6-9967-badd62c1a155","Type":"ContainerDied","Data":"44e3ffe4c5022a3dae1e2fd671e95fc1ee641f7665dabc91a545e50f3b8f0634"} Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 00:46:43.620737 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" podStartSLOduration=3.6207210930000002 podStartE2EDuration="3.620721093s" podCreationTimestamp="2025-12-12 00:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:43.618570536 +0000 UTC m=+1394.163923412" watchObservedRunningTime="2025-12-12 00:46:43.620721093 +0000 UTC m=+1394.166073959" Dec 12 00:46:43 crc kubenswrapper[4606]: I1212 00:46:43.742356 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" path="/var/lib/kubelet/pods/dec2500f-eaf5-4ee7-9c07-66f7dda126f3/volumes" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.003681 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.120345 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.222363 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-ovndb-tls-certs\") pod \"218c1acf-b25f-43b6-9967-badd62c1a155\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.222446 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s9dc\" (UniqueName: \"kubernetes.io/projected/218c1acf-b25f-43b6-9967-badd62c1a155-kube-api-access-8s9dc\") pod \"218c1acf-b25f-43b6-9967-badd62c1a155\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.222558 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-config\") pod \"218c1acf-b25f-43b6-9967-badd62c1a155\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.222577 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-combined-ca-bundle\") pod \"218c1acf-b25f-43b6-9967-badd62c1a155\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.222619 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-httpd-config\") pod \"218c1acf-b25f-43b6-9967-badd62c1a155\" (UID: \"218c1acf-b25f-43b6-9967-badd62c1a155\") " Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.230277 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "218c1acf-b25f-43b6-9967-badd62c1a155" (UID: "218c1acf-b25f-43b6-9967-badd62c1a155"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.234377 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218c1acf-b25f-43b6-9967-badd62c1a155-kube-api-access-8s9dc" (OuterVolumeSpecName: "kube-api-access-8s9dc") pod "218c1acf-b25f-43b6-9967-badd62c1a155" (UID: "218c1acf-b25f-43b6-9967-badd62c1a155"). InnerVolumeSpecName "kube-api-access-8s9dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.312711 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "218c1acf-b25f-43b6-9967-badd62c1a155" (UID: "218c1acf-b25f-43b6-9967-badd62c1a155"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.316768 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-config" (OuterVolumeSpecName: "config") pod "218c1acf-b25f-43b6-9967-badd62c1a155" (UID: "218c1acf-b25f-43b6-9967-badd62c1a155"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.321786 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "218c1acf-b25f-43b6-9967-badd62c1a155" (UID: "218c1acf-b25f-43b6-9967-badd62c1a155"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.325315 4606 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.325364 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s9dc\" (UniqueName: \"kubernetes.io/projected/218c1acf-b25f-43b6-9967-badd62c1a155-kube-api-access-8s9dc\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.325378 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.325388 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:44 
crc kubenswrapper[4606]: I1212 00:46:44.325397 4606 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/218c1acf-b25f-43b6-9967-badd62c1a155-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.610040 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"06ce9244-6f03-4386-8b05-fdca2f2e1120","Type":"ContainerStarted","Data":"16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1"} Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.612464 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fbd55103-6007-4c6a-b06e-add09fe3483a","Type":"ContainerStarted","Data":"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693"} Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.612657 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api-log" containerID="cri-o://1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559" gracePeriod=30 Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.613061 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.613474 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api" containerID="cri-o://baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693" gracePeriod=30 Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.627074 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dfd45968b-nj9ll" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.627472 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd45968b-nj9ll" event={"ID":"218c1acf-b25f-43b6-9967-badd62c1a155","Type":"ContainerDied","Data":"88f5868b474b733fb729888cb06fc225d7f583d8440c6ee4c40f655edef62b54"} Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.627551 4606 scope.go:117] "RemoveContainer" containerID="d1b6480067229a434be4e619b942a9f48ca05a819714ceeaa05d7ca9e025c51b" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.641319 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.641303483 podStartE2EDuration="4.641303483s" podCreationTimestamp="2025-12-12 00:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:44.630985736 +0000 UTC m=+1395.176338602" watchObservedRunningTime="2025-12-12 00:46:44.641303483 +0000 UTC m=+1395.186656349" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.666542 4606 scope.go:117] "RemoveContainer" containerID="44e3ffe4c5022a3dae1e2fd671e95fc1ee641f7665dabc91a545e50f3b8f0634" Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.671281 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dfd45968b-nj9ll"] Dec 12 00:46:44 crc kubenswrapper[4606]: I1212 00:46:44.682011 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5dfd45968b-nj9ll"] Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.201591 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.343761 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-combined-ca-bundle\") pod \"fbd55103-6007-4c6a-b06e-add09fe3483a\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.343996 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data\") pod \"fbd55103-6007-4c6a-b06e-add09fe3483a\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.344550 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbd55103-6007-4c6a-b06e-add09fe3483a-logs\") pod \"fbd55103-6007-4c6a-b06e-add09fe3483a\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.344714 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fbd55103-6007-4c6a-b06e-add09fe3483a-etc-machine-id\") pod \"fbd55103-6007-4c6a-b06e-add09fe3483a\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.344818 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fbd55103-6007-4c6a-b06e-add09fe3483a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fbd55103-6007-4c6a-b06e-add09fe3483a" (UID: "fbd55103-6007-4c6a-b06e-add09fe3483a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.344864 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbd55103-6007-4c6a-b06e-add09fe3483a-logs" (OuterVolumeSpecName: "logs") pod "fbd55103-6007-4c6a-b06e-add09fe3483a" (UID: "fbd55103-6007-4c6a-b06e-add09fe3483a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.344883 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-scripts\") pod \"fbd55103-6007-4c6a-b06e-add09fe3483a\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.345084 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data-custom\") pod \"fbd55103-6007-4c6a-b06e-add09fe3483a\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.345466 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n6xc\" (UniqueName: \"kubernetes.io/projected/fbd55103-6007-4c6a-b06e-add09fe3483a-kube-api-access-7n6xc\") pod \"fbd55103-6007-4c6a-b06e-add09fe3483a\" (UID: \"fbd55103-6007-4c6a-b06e-add09fe3483a\") " Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.345904 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbd55103-6007-4c6a-b06e-add09fe3483a-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.345969 4606 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/fbd55103-6007-4c6a-b06e-add09fe3483a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.348932 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fbd55103-6007-4c6a-b06e-add09fe3483a" (UID: "fbd55103-6007-4c6a-b06e-add09fe3483a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.349485 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd55103-6007-4c6a-b06e-add09fe3483a-kube-api-access-7n6xc" (OuterVolumeSpecName: "kube-api-access-7n6xc") pod "fbd55103-6007-4c6a-b06e-add09fe3483a" (UID: "fbd55103-6007-4c6a-b06e-add09fe3483a"). InnerVolumeSpecName "kube-api-access-7n6xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.350115 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-scripts" (OuterVolumeSpecName: "scripts") pod "fbd55103-6007-4c6a-b06e-add09fe3483a" (UID: "fbd55103-6007-4c6a-b06e-add09fe3483a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.382332 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbd55103-6007-4c6a-b06e-add09fe3483a" (UID: "fbd55103-6007-4c6a-b06e-add09fe3483a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.413444 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data" (OuterVolumeSpecName: "config-data") pod "fbd55103-6007-4c6a-b06e-add09fe3483a" (UID: "fbd55103-6007-4c6a-b06e-add09fe3483a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.448594 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n6xc\" (UniqueName: \"kubernetes.io/projected/fbd55103-6007-4c6a-b06e-add09fe3483a-kube-api-access-7n6xc\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.448674 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.448684 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.448695 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.448703 4606 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbd55103-6007-4c6a-b06e-add09fe3483a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.639700 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"06ce9244-6f03-4386-8b05-fdca2f2e1120","Type":"ContainerStarted","Data":"117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0"} Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.641828 4606 generic.go:334] "Generic (PLEG): container finished" podID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerID="baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693" exitCode=0 Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.641968 4606 generic.go:334] "Generic (PLEG): container finished" podID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerID="1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559" exitCode=143 Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.641896 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fbd55103-6007-4c6a-b06e-add09fe3483a","Type":"ContainerDied","Data":"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693"} Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.642150 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fbd55103-6007-4c6a-b06e-add09fe3483a","Type":"ContainerDied","Data":"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559"} Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.642271 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fbd55103-6007-4c6a-b06e-add09fe3483a","Type":"ContainerDied","Data":"01789232fd6a2fc91306ac8aa3932c797e58354bba354de23bc80747991d47b5"} Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.642190 4606 scope.go:117] "RemoveContainer" containerID="baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.641880 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.664548 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.343439389 podStartE2EDuration="5.664531472s" podCreationTimestamp="2025-12-12 00:46:40 +0000 UTC" firstStartedPulling="2025-12-12 00:46:41.61936062 +0000 UTC m=+1392.164713486" lastFinishedPulling="2025-12-12 00:46:42.940452703 +0000 UTC m=+1393.485805569" observedRunningTime="2025-12-12 00:46:45.659264111 +0000 UTC m=+1396.204616977" watchObservedRunningTime="2025-12-12 00:46:45.664531472 +0000 UTC m=+1396.209884328" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.673482 4606 scope.go:117] "RemoveContainer" containerID="1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.690252 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.711829 4606 scope.go:117] "RemoveContainer" containerID="baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693" Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.712983 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693\": container with ID starting with baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693 not found: ID does not exist" containerID="baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.713025 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693"} err="failed to get container status \"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693\": rpc error: code = 
NotFound desc = could not find container \"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693\": container with ID starting with baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693 not found: ID does not exist" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.713042 4606 scope.go:117] "RemoveContainer" containerID="1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559" Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.714782 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559\": container with ID starting with 1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559 not found: ID does not exist" containerID="1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.714841 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559"} err="failed to get container status \"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559\": rpc error: code = NotFound desc = could not find container \"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559\": container with ID starting with 1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559 not found: ID does not exist" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.714867 4606 scope.go:117] "RemoveContainer" containerID="baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.719824 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693"} err="failed to get container status \"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693\": rpc 
error: code = NotFound desc = could not find container \"baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693\": container with ID starting with baf887f253e34b99d257143fb713a695464c8c53d561ab4cb959e9b411161693 not found: ID does not exist" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.720048 4606 scope.go:117] "RemoveContainer" containerID="1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.721572 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559"} err="failed to get container status \"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559\": rpc error: code = NotFound desc = could not find container \"1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559\": container with ID starting with 1cfb5ccac676e76ed0f873d2700b84ee06e877d6c8ee5c03e3ed0d8cb7b07559 not found: ID does not exist" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.723546 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" path="/var/lib/kubelet/pods/218c1acf-b25f-43b6-9967-badd62c1a155/volumes" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.725388 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.725601 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.726324 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api-log" Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.726440 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api-log" Dec 12 00:46:45 crc 
Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.726590 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.726699 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.726837 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api-log"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.726946 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api-log"
Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.727072 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.727141 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.727975 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-httpd"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.728122 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-httpd"
Dec 12 00:46:45 crc kubenswrapper[4606]: E1212 00:46:45.728278 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.728369 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.728753 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-httpd"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.728894 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.728988 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="218c1acf-b25f-43b6-9967-badd62c1a155" containerName="neutron-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.729061 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api-log"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.729128 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.729240 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" containerName="cinder-api-log"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.733770 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.735308 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.750638 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.751017 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.751012 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.856133 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-scripts\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.856555 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.856630 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.856706 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lt44\" (UniqueName: \"kubernetes.io/projected/2cacde66-96b8-437e-86b5-aefba1e473ae-kube-api-access-8lt44\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.856803 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cacde66-96b8-437e-86b5-aefba1e473ae-logs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.856890 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-config-data\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.856971 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-config-data-custom\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.857040 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.857109 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cacde66-96b8-437e-86b5-aefba1e473ae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960587 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-scripts\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960671 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960706 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960760 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lt44\" (UniqueName: \"kubernetes.io/projected/2cacde66-96b8-437e-86b5-aefba1e473ae-kube-api-access-8lt44\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960855 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cacde66-96b8-437e-86b5-aefba1e473ae-logs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960905 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-config-data\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960940 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-config-data-custom\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.960966 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.961003 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cacde66-96b8-437e-86b5-aefba1e473ae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.961209 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cacde66-96b8-437e-86b5-aefba1e473ae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.961998 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cacde66-96b8-437e-86b5-aefba1e473ae-logs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.978385 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.979165 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-scripts\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.981845 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-config-data-custom\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.983570 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-config-data\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:45 crc kubenswrapper[4606]: I1212 00:46:45.988620 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.003228 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lt44\" (UniqueName: \"kubernetes.io/projected/2cacde66-96b8-437e-86b5-aefba1e473ae-kube-api-access-8lt44\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.010429 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cacde66-96b8-437e-86b5-aefba1e473ae-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2cacde66-96b8-437e-86b5-aefba1e473ae\") " pod="openstack/cinder-api-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.148216 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.176478 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.299845 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-run-httpd\") pod \"88bd7935-19a0-486d-b1e7-4737abcf21ab\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") "
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.300096 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-config-data\") pod \"88bd7935-19a0-486d-b1e7-4737abcf21ab\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") "
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.300122 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-log-httpd\") pod \"88bd7935-19a0-486d-b1e7-4737abcf21ab\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") "
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.300159 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-scripts\") pod \"88bd7935-19a0-486d-b1e7-4737abcf21ab\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") "
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.300209 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-combined-ca-bundle\") pod \"88bd7935-19a0-486d-b1e7-4737abcf21ab\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") "
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.300296 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-sg-core-conf-yaml\") pod \"88bd7935-19a0-486d-b1e7-4737abcf21ab\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") "
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.300360 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqrqm\" (UniqueName: \"kubernetes.io/projected/88bd7935-19a0-486d-b1e7-4737abcf21ab-kube-api-access-jqrqm\") pod \"88bd7935-19a0-486d-b1e7-4737abcf21ab\" (UID: \"88bd7935-19a0-486d-b1e7-4737abcf21ab\") "
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.307408 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88bd7935-19a0-486d-b1e7-4737abcf21ab" (UID: "88bd7935-19a0-486d-b1e7-4737abcf21ab"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.307566 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88bd7935-19a0-486d-b1e7-4737abcf21ab" (UID: "88bd7935-19a0-486d-b1e7-4737abcf21ab"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.307973 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88bd7935-19a0-486d-b1e7-4737abcf21ab-kube-api-access-jqrqm" (OuterVolumeSpecName: "kube-api-access-jqrqm") pod "88bd7935-19a0-486d-b1e7-4737abcf21ab" (UID: "88bd7935-19a0-486d-b1e7-4737abcf21ab"). InnerVolumeSpecName "kube-api-access-jqrqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.312523 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-scripts" (OuterVolumeSpecName: "scripts") pod "88bd7935-19a0-486d-b1e7-4737abcf21ab" (UID: "88bd7935-19a0-486d-b1e7-4737abcf21ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.352129 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88bd7935-19a0-486d-b1e7-4737abcf21ab" (UID: "88bd7935-19a0-486d-b1e7-4737abcf21ab"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.402841 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.402870 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqrqm\" (UniqueName: \"kubernetes.io/projected/88bd7935-19a0-486d-b1e7-4737abcf21ab-kube-api-access-jqrqm\") on node \"crc\" DevicePath \"\""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.402882 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.402890 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88bd7935-19a0-486d-b1e7-4737abcf21ab-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.402898 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-scripts\") on node \"crc\" DevicePath \"\""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.424805 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88bd7935-19a0-486d-b1e7-4737abcf21ab" (UID: "88bd7935-19a0-486d-b1e7-4737abcf21ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.443269 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6575dff54-tdrpc" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.444288 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6575dff54-tdrpc" podUID="dec2500f-eaf5-4ee7-9c07-66f7dda126f3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.462518 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-config-data" (OuterVolumeSpecName: "config-data") pod "88bd7935-19a0-486d-b1e7-4737abcf21ab" (UID: "88bd7935-19a0-486d-b1e7-4737abcf21ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.504741 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-config-data\") on node \"crc\" DevicePath \"\""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.504910 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bd7935-19a0-486d-b1e7-4737abcf21ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.663001 4606 generic.go:334] "Generic (PLEG): container finished" podID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerID="37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f" exitCode=0
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.663407 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerDied","Data":"37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f"}
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.663455 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88bd7935-19a0-486d-b1e7-4737abcf21ab","Type":"ContainerDied","Data":"19657743b1bb044f1d0b8091aa2db14366bbfaf9e010174bb9f635c7d1affe54"}
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.663476 4606 scope.go:117] "RemoveContainer" containerID="76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.664072 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: W1212 00:46:46.671948 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cacde66_96b8_437e_86b5_aefba1e473ae.slice/crio-f1470c82d61d09b43dc3daa8a8ded8945d76c5e6e390d7aadb63967f737569b6 WatchSource:0}: Error finding container f1470c82d61d09b43dc3daa8a8ded8945d76c5e6e390d7aadb63967f737569b6: Status 404 returned error can't find the container with id f1470c82d61d09b43dc3daa8a8ded8945d76c5e6e390d7aadb63967f737569b6
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.673678 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.741093 4606 scope.go:117] "RemoveContainer" containerID="50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.746365 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.761313 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770247 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.770658 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="sg-core"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770681 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="sg-core"
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.770720 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="proxy-httpd"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770728 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="proxy-httpd"
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.770754 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-central-agent"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770760 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-central-agent"
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.770770 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-notification-agent"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770775 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-notification-agent"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770949 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="proxy-httpd"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770964 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="sg-core"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770978 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-central-agent"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.770987 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" containerName="ceilometer-notification-agent"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.772853 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.775708 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.783574 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.793998 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.864964 4606 scope.go:117] "RemoveContainer" containerID="37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.913704 4606 scope.go:117] "RemoveContainer" containerID="4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.915135 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jwsl\" (UniqueName: \"kubernetes.io/projected/96587cc5-74f0-484c-8217-7d1a09d39580-kube-api-access-8jwsl\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.915220 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.915239 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-log-httpd\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.915254 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.915306 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-config-data\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.915415 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-run-httpd\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.915568 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-scripts\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.951530 4606 scope.go:117] "RemoveContainer" containerID="76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce"
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.952295 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce\": container with ID starting with 76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce not found: ID does not exist" containerID="76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.952345 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce"} err="failed to get container status \"76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce\": rpc error: code = NotFound desc = could not find container \"76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce\": container with ID starting with 76d3f4c4a61e46d0c4f7b43e3c1a4412a13b89768aaae2e240e93ad49520ebce not found: ID does not exist"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.952370 4606 scope.go:117] "RemoveContainer" containerID="50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5"
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.952760 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5\": container with ID starting with 50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5 not found: ID does not exist" containerID="50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.952794 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5"} err="failed to get container status \"50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5\": rpc error: code = NotFound desc = could not find container \"50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5\": container with ID starting with 50ac6e7844b43e9e559ddc58160fae5e9b534b89c917ad342b3072a675f38fc5 not found: ID does not exist"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.952816 4606 scope.go:117] "RemoveContainer" containerID="37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f"
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.953603 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f\": container with ID starting with 37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f not found: ID does not exist" containerID="37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.953633 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f"} err="failed to get container status \"37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f\": rpc error: code = NotFound desc = could not find container \"37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f\": container with ID starting with 37f3cac0d54a98025f3895722de48b6be5179e91e78542fa9dd933d5d466826f not found: ID does not exist"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.953658 4606 scope.go:117] "RemoveContainer" containerID="4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0"
Dec 12 00:46:46 crc kubenswrapper[4606]: E1212 00:46:46.953931 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0\": container with ID starting with 4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0 not found: ID does not exist" containerID="4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0"
Dec 12 00:46:46 crc kubenswrapper[4606]: I1212 00:46:46.953991 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0"} err="failed to get container status \"4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0\": rpc error: code = NotFound desc = could not find container \"4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0\": container with ID starting with 4b5bdf970c54fef8e9c089fcdbb9c46b973ecdbfcce0ef2b0dc6b071a5a7f0b0 not found: ID does not exist"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.018546 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-run-httpd\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.018671 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-scripts\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.018747 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jwsl\" (UniqueName: \"kubernetes.io/projected/96587cc5-74f0-484c-8217-7d1a09d39580-kube-api-access-8jwsl\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.018800 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-log-httpd\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.018852 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.019398 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-run-httpd\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.019558 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-log-httpd\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.018875 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-config-data\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.020587 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.027020 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-scripts\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.027505 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-config-data\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.038361 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jwsl\" (UniqueName: \"kubernetes.io/projected/96587cc5-74f0-484c-8217-7d1a09d39580-kube-api-access-8jwsl\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.040356 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.050864 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " pod="openstack/ceilometer-0"
Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.166200 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.688541 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2cacde66-96b8-437e-86b5-aefba1e473ae","Type":"ContainerStarted","Data":"f1470c82d61d09b43dc3daa8a8ded8945d76c5e6e390d7aadb63967f737569b6"} Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.711056 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88bd7935-19a0-486d-b1e7-4737abcf21ab" path="/var/lib/kubelet/pods/88bd7935-19a0-486d-b1e7-4737abcf21ab/volumes" Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.712005 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd55103-6007-4c6a-b06e-add09fe3483a" path="/var/lib/kubelet/pods/fbd55103-6007-4c6a-b06e-add09fe3483a/volumes" Dec 12 00:46:47 crc kubenswrapper[4606]: I1212 00:46:47.923424 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:46:47 crc kubenswrapper[4606]: W1212 00:46:47.932775 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96587cc5_74f0_484c_8217_7d1a09d39580.slice/crio-fd93240d55a848dddf7cac622fc2ad62a501602170326b396b860eb1c8fe9f80 WatchSource:0}: Error finding container fd93240d55a848dddf7cac622fc2ad62a501602170326b396b860eb1c8fe9f80: Status 404 returned error can't find the container with id fd93240d55a848dddf7cac622fc2ad62a501602170326b396b860eb1c8fe9f80 Dec 12 00:46:48 crc kubenswrapper[4606]: I1212 00:46:48.700747 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerStarted","Data":"5ef316d75b266fd5423e7c5484db50d86dee9e7d2190bf4b280ed073bc688763"} Dec 12 00:46:48 crc kubenswrapper[4606]: I1212 00:46:48.701270 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerStarted","Data":"fd93240d55a848dddf7cac622fc2ad62a501602170326b396b860eb1c8fe9f80"} Dec 12 00:46:48 crc kubenswrapper[4606]: I1212 00:46:48.703593 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2cacde66-96b8-437e-86b5-aefba1e473ae","Type":"ContainerStarted","Data":"09b07bc0ecf529c3bfe3aa055c576d67ac656acd1d98b872e77146921f265b04"} Dec 12 00:46:48 crc kubenswrapper[4606]: I1212 00:46:48.703622 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2cacde66-96b8-437e-86b5-aefba1e473ae","Type":"ContainerStarted","Data":"64c45d944111ff3b62a9c2b09e1277e85fad6e3efb4e91cc79a957e97fb518cd"} Dec 12 00:46:48 crc kubenswrapper[4606]: I1212 00:46:48.703795 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 12 00:46:48 crc kubenswrapper[4606]: I1212 00:46:48.723625 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.723606758 podStartE2EDuration="3.723606758s" podCreationTimestamp="2025-12-12 00:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:48.721553753 +0000 UTC m=+1399.266906639" watchObservedRunningTime="2025-12-12 00:46:48.723606758 +0000 UTC m=+1399.268959624" Dec 12 00:46:49 crc kubenswrapper[4606]: I1212 00:46:49.726772 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerStarted","Data":"a71d4498cd22cd092cf4563dd14745bff4bb0cd8de6d11b853522795379c5bed"} Dec 12 00:46:50 crc kubenswrapper[4606]: I1212 00:46:50.507738 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 12 00:46:50 crc kubenswrapper[4606]: I1212 
00:46:50.540781 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:46:50 crc kubenswrapper[4606]: I1212 00:46:50.614831 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-gjqxm"] Dec 12 00:46:50 crc kubenswrapper[4606]: I1212 00:46:50.615110 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerName="dnsmasq-dns" containerID="cri-o://ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f" gracePeriod=10 Dec 12 00:46:50 crc kubenswrapper[4606]: I1212 00:46:50.763413 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerStarted","Data":"facdd7194e1066b996e364ccd0c06b074318090aeaaa48cc595ed41968e7f63f"} Dec 12 00:46:50 crc kubenswrapper[4606]: I1212 00:46:50.987125 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.058674 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.361901 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.500136 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-svc\") pod \"17508d9b-339d-4b3a-ab72-e234a8ec168a\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.500514 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-sb\") pod \"17508d9b-339d-4b3a-ab72-e234a8ec168a\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.500668 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-nb\") pod \"17508d9b-339d-4b3a-ab72-e234a8ec168a\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.500793 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-config\") pod \"17508d9b-339d-4b3a-ab72-e234a8ec168a\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.501415 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntxf2\" (UniqueName: \"kubernetes.io/projected/17508d9b-339d-4b3a-ab72-e234a8ec168a-kube-api-access-ntxf2\") pod \"17508d9b-339d-4b3a-ab72-e234a8ec168a\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.501633 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-swift-storage-0\") pod \"17508d9b-339d-4b3a-ab72-e234a8ec168a\" (UID: \"17508d9b-339d-4b3a-ab72-e234a8ec168a\") " Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.508076 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17508d9b-339d-4b3a-ab72-e234a8ec168a-kube-api-access-ntxf2" (OuterVolumeSpecName: "kube-api-access-ntxf2") pod "17508d9b-339d-4b3a-ab72-e234a8ec168a" (UID: "17508d9b-339d-4b3a-ab72-e234a8ec168a"). InnerVolumeSpecName "kube-api-access-ntxf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.569822 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17508d9b-339d-4b3a-ab72-e234a8ec168a" (UID: "17508d9b-339d-4b3a-ab72-e234a8ec168a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.581436 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17508d9b-339d-4b3a-ab72-e234a8ec168a" (UID: "17508d9b-339d-4b3a-ab72-e234a8ec168a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.583924 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-config" (OuterVolumeSpecName: "config") pod "17508d9b-339d-4b3a-ab72-e234a8ec168a" (UID: "17508d9b-339d-4b3a-ab72-e234a8ec168a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.599754 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17508d9b-339d-4b3a-ab72-e234a8ec168a" (UID: "17508d9b-339d-4b3a-ab72-e234a8ec168a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.604809 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.604839 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.604851 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntxf2\" (UniqueName: \"kubernetes.io/projected/17508d9b-339d-4b3a-ab72-e234a8ec168a-kube-api-access-ntxf2\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.604863 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.604871 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.605945 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17508d9b-339d-4b3a-ab72-e234a8ec168a" (UID: "17508d9b-339d-4b3a-ab72-e234a8ec168a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.706220 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17508d9b-339d-4b3a-ab72-e234a8ec168a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.772277 4606 generic.go:334] "Generic (PLEG): container finished" podID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerID="ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f" exitCode=0 Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.772473 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" event={"ID":"17508d9b-339d-4b3a-ab72-e234a8ec168a","Type":"ContainerDied","Data":"ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f"} Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.773248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" event={"ID":"17508d9b-339d-4b3a-ab72-e234a8ec168a","Type":"ContainerDied","Data":"289ea5f374286f826b2c02768e282721733354cb1a0c0ffa427f39c7320130ad"} Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.773316 4606 scope.go:117] "RemoveContainer" containerID="ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.772539 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.777374 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="cinder-scheduler" containerID="cri-o://16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1" gracePeriod=30 Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.778404 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerStarted","Data":"42f5bc69b4cc2aa4850677899d07272bbd6a3678c140953c96a2c041e4557759"} Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.778533 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="probe" containerID="cri-o://117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0" gracePeriod=30 Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.778620 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.815832 4606 scope.go:117] "RemoveContainer" containerID="f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.861962 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.609967685 podStartE2EDuration="5.86194561s" podCreationTimestamp="2025-12-12 00:46:46 +0000 UTC" firstStartedPulling="2025-12-12 00:46:47.935832909 +0000 UTC m=+1398.481185775" lastFinishedPulling="2025-12-12 00:46:51.187810834 +0000 UTC m=+1401.733163700" observedRunningTime="2025-12-12 00:46:51.828441662 +0000 UTC m=+1402.373794538" watchObservedRunningTime="2025-12-12 00:46:51.86194561 +0000 UTC 
m=+1402.407298476" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.874995 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-gjqxm"] Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.891501 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-gjqxm"] Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.891764 4606 scope.go:117] "RemoveContainer" containerID="ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f" Dec 12 00:46:51 crc kubenswrapper[4606]: E1212 00:46:51.892272 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f\": container with ID starting with ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f not found: ID does not exist" containerID="ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.892306 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f"} err="failed to get container status \"ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f\": rpc error: code = NotFound desc = could not find container \"ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f\": container with ID starting with ede0ca97d555480e9bff7d8de1dec15def97f16bfc3692e0b4036023c8b4e71f not found: ID does not exist" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.892331 4606 scope.go:117] "RemoveContainer" containerID="f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0" Dec 12 00:46:51 crc kubenswrapper[4606]: E1212 00:46:51.892621 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0\": container with ID starting with f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0 not found: ID does not exist" containerID="f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0" Dec 12 00:46:51 crc kubenswrapper[4606]: I1212 00:46:51.892647 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0"} err="failed to get container status \"f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0\": rpc error: code = NotFound desc = could not find container \"f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0\": container with ID starting with f2ffccb029ab6dad21f9321765317f8d1bc1d579c34566e7f8d448acfc094fa0 not found: ID does not exist" Dec 12 00:46:52 crc kubenswrapper[4606]: I1212 00:46:52.786997 4606 generic.go:334] "Generic (PLEG): container finished" podID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerID="117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0" exitCode=0 Dec 12 00:46:52 crc kubenswrapper[4606]: I1212 00:46:52.787074 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"06ce9244-6f03-4386-8b05-fdca2f2e1120","Type":"ContainerDied","Data":"117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0"} Dec 12 00:46:52 crc kubenswrapper[4606]: I1212 00:46:52.930380 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76b48998b8-ff8r8" Dec 12 00:46:53 crc kubenswrapper[4606]: I1212 00:46:53.710518 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" path="/var/lib/kubelet/pods/17508d9b-339d-4b3a-ab72-e234a8ec168a/volumes" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.181844 4606 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/swift-proxy-7b868b57d5-xjh67"] Dec 12 00:46:56 crc kubenswrapper[4606]: E1212 00:46:56.188716 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerName="dnsmasq-dns" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.188922 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerName="dnsmasq-dns" Dec 12 00:46:56 crc kubenswrapper[4606]: E1212 00:46:56.188946 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerName="init" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.188952 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerName="init" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.189218 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerName="dnsmasq-dns" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.191945 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.198803 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.199044 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.199148 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.205382 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-gjqxm" podUID="17508d9b-339d-4b3a-ab72-e234a8ec168a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.160:5353: i/o timeout" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.209435 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b868b57d5-xjh67"] Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.309091 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-combined-ca-bundle\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.309149 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/723ab405-5905-44a6-a625-39fbc78948ef-log-httpd\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.309197 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/723ab405-5905-44a6-a625-39fbc78948ef-run-httpd\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.309238 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723ab405-5905-44a6-a625-39fbc78948ef-etc-swift\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.309265 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-internal-tls-certs\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.309321 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-public-tls-certs\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.309342 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcmx\" (UniqueName: \"kubernetes.io/projected/723ab405-5905-44a6-a625-39fbc78948ef-kube-api-access-7hcmx\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 
00:46:56.309363 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-config-data\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415389 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723ab405-5905-44a6-a625-39fbc78948ef-etc-swift\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415483 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-internal-tls-certs\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415636 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-public-tls-certs\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415669 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hcmx\" (UniqueName: \"kubernetes.io/projected/723ab405-5905-44a6-a625-39fbc78948ef-kube-api-access-7hcmx\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415706 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-config-data\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415788 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-combined-ca-bundle\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415846 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/723ab405-5905-44a6-a625-39fbc78948ef-log-httpd\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.415905 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/723ab405-5905-44a6-a625-39fbc78948ef-run-httpd\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.417536 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/723ab405-5905-44a6-a625-39fbc78948ef-run-httpd\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.418671 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/723ab405-5905-44a6-a625-39fbc78948ef-log-httpd\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.421115 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723ab405-5905-44a6-a625-39fbc78948ef-etc-swift\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.421518 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-combined-ca-bundle\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.422889 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-internal-tls-certs\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.426117 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-config-data\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.428260 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/723ab405-5905-44a6-a625-39fbc78948ef-public-tls-certs\") pod 
\"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.441212 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hcmx\" (UniqueName: \"kubernetes.io/projected/723ab405-5905-44a6-a625-39fbc78948ef-kube-api-access-7hcmx\") pod \"swift-proxy-7b868b57d5-xjh67\" (UID: \"723ab405-5905-44a6-a625-39fbc78948ef\") " pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.540280 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.572862 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.619727 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-combined-ca-bundle\") pod \"06ce9244-6f03-4386-8b05-fdca2f2e1120\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.619797 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-scripts\") pod \"06ce9244-6f03-4386-8b05-fdca2f2e1120\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.619837 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data\") pod \"06ce9244-6f03-4386-8b05-fdca2f2e1120\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.619912 4606 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data-custom\") pod \"06ce9244-6f03-4386-8b05-fdca2f2e1120\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.619973 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06ce9244-6f03-4386-8b05-fdca2f2e1120-etc-machine-id\") pod \"06ce9244-6f03-4386-8b05-fdca2f2e1120\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.620004 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4d2d\" (UniqueName: \"kubernetes.io/projected/06ce9244-6f03-4386-8b05-fdca2f2e1120-kube-api-access-s4d2d\") pod \"06ce9244-6f03-4386-8b05-fdca2f2e1120\" (UID: \"06ce9244-6f03-4386-8b05-fdca2f2e1120\") " Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.623245 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06ce9244-6f03-4386-8b05-fdca2f2e1120-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "06ce9244-6f03-4386-8b05-fdca2f2e1120" (UID: "06ce9244-6f03-4386-8b05-fdca2f2e1120"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.627291 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-scripts" (OuterVolumeSpecName: "scripts") pod "06ce9244-6f03-4386-8b05-fdca2f2e1120" (UID: "06ce9244-6f03-4386-8b05-fdca2f2e1120"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.635226 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ce9244-6f03-4386-8b05-fdca2f2e1120-kube-api-access-s4d2d" (OuterVolumeSpecName: "kube-api-access-s4d2d") pod "06ce9244-6f03-4386-8b05-fdca2f2e1120" (UID: "06ce9244-6f03-4386-8b05-fdca2f2e1120"). InnerVolumeSpecName "kube-api-access-s4d2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.642296 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "06ce9244-6f03-4386-8b05-fdca2f2e1120" (UID: "06ce9244-6f03-4386-8b05-fdca2f2e1120"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.690156 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06ce9244-6f03-4386-8b05-fdca2f2e1120" (UID: "06ce9244-6f03-4386-8b05-fdca2f2e1120"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.722576 4606 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06ce9244-6f03-4386-8b05-fdca2f2e1120-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.723429 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4d2d\" (UniqueName: \"kubernetes.io/projected/06ce9244-6f03-4386-8b05-fdca2f2e1120-kube-api-access-s4d2d\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.723969 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.724040 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.724096 4606 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.814909 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data" (OuterVolumeSpecName: "config-data") pod "06ce9244-6f03-4386-8b05-fdca2f2e1120" (UID: "06ce9244-6f03-4386-8b05-fdca2f2e1120"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.826126 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce9244-6f03-4386-8b05-fdca2f2e1120-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.836536 4606 generic.go:334] "Generic (PLEG): container finished" podID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerID="16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1" exitCode=0 Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.836752 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"06ce9244-6f03-4386-8b05-fdca2f2e1120","Type":"ContainerDied","Data":"16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1"} Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.836849 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"06ce9244-6f03-4386-8b05-fdca2f2e1120","Type":"ContainerDied","Data":"37cb80a31bb6d32d18b30de4db071bc18f81850cf0c10c78ee00f99b06da5ec5"} Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.836951 4606 scope.go:117] "RemoveContainer" containerID="117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.837359 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.887266 4606 scope.go:117] "RemoveContainer" containerID="16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.951367 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.974882 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.977970 4606 scope.go:117] "RemoveContainer" containerID="117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0" Dec 12 00:46:56 crc kubenswrapper[4606]: E1212 00:46:56.980313 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0\": container with ID starting with 117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0 not found: ID does not exist" containerID="117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.980486 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0"} err="failed to get container status \"117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0\": rpc error: code = NotFound desc = could not find container \"117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0\": container with ID starting with 117a918723b40673fbcff7184808486967fd2181a70829cedb1116c344a75dd0 not found: ID does not exist" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.980603 4606 scope.go:117] "RemoveContainer" containerID="16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1" Dec 12 00:46:56 crc 
kubenswrapper[4606]: E1212 00:46:56.980943 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1\": container with ID starting with 16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1 not found: ID does not exist" containerID="16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.980980 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1"} err="failed to get container status \"16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1\": rpc error: code = NotFound desc = could not find container \"16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1\": container with ID starting with 16bc2caf9abd0d9904b1f352d55a13e16dca4b6ee1ea0868f25d0bd709b7f6e1 not found: ID does not exist" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.982032 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:56 crc kubenswrapper[4606]: E1212 00:46:56.982527 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="cinder-scheduler" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.982547 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="cinder-scheduler" Dec 12 00:46:56 crc kubenswrapper[4606]: E1212 00:46:56.982576 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="probe" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.982585 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="probe" Dec 12 00:46:56 crc 
kubenswrapper[4606]: I1212 00:46:56.982813 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="cinder-scheduler" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.982841 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" containerName="probe" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.983884 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.991869 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 12 00:46:56 crc kubenswrapper[4606]: I1212 00:46:56.996569 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:56.997950 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.000490 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.000632 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.003519 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-99llt" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.007907 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.039371 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132228 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-config-data\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132268 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132319 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132337 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132351 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dedb8b2a-538c-4175-9ead-0d889ae2fd40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132374 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config-secret\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132405 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132431 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hnfn\" (UniqueName: \"kubernetes.io/projected/1bacd965-540d-4189-b4f8-592b4449cdac-kube-api-access-2hnfn\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132467 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc2hh\" (UniqueName: \"kubernetes.io/projected/dedb8b2a-538c-4175-9ead-0d889ae2fd40-kube-api-access-gc2hh\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.132496 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-scripts\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.226164 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b868b57d5-xjh67"] Dec 12 00:46:57 crc 
kubenswrapper[4606]: W1212 00:46:57.229513 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod723ab405_5905_44a6_a625_39fbc78948ef.slice/crio-f7eee0b4086267a11d064513e9d7315ee5214c698ca17ef39f9f009e4ff3cf32 WatchSource:0}: Error finding container f7eee0b4086267a11d064513e9d7315ee5214c698ca17ef39f9f009e4ff3cf32: Status 404 returned error can't find the container with id f7eee0b4086267a11d064513e9d7315ee5214c698ca17ef39f9f009e4ff3cf32 Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233536 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc2hh\" (UniqueName: \"kubernetes.io/projected/dedb8b2a-538c-4175-9ead-0d889ae2fd40-kube-api-access-gc2hh\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233593 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-scripts\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233634 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-config-data\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233656 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 
00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233705 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233727 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233744 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dedb8b2a-538c-4175-9ead-0d889ae2fd40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233768 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config-secret\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233796 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.233830 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2hnfn\" (UniqueName: \"kubernetes.io/projected/1bacd965-540d-4189-b4f8-592b4449cdac-kube-api-access-2hnfn\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.243321 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dedb8b2a-538c-4175-9ead-0d889ae2fd40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.244470 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.245023 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-scripts\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.247898 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.249597 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " 
pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.249897 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-config-data\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.261876 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dedb8b2a-538c-4175-9ead-0d889ae2fd40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.263384 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc2hh\" (UniqueName: \"kubernetes.io/projected/dedb8b2a-538c-4175-9ead-0d889ae2fd40-kube-api-access-gc2hh\") pod \"cinder-scheduler-0\" (UID: \"dedb8b2a-538c-4175-9ead-0d889ae2fd40\") " pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.274752 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hnfn\" (UniqueName: \"kubernetes.io/projected/1bacd965-540d-4189-b4f8-592b4449cdac-kube-api-access-2hnfn\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.275300 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config-secret\") pod \"openstackclient\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.326562 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.353772 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.361001 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.384496 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.404233 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.405624 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.426075 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.542128 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d0864c8-b45f-4324-a56f-ff583d488da0-openstack-config-secret\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.542614 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d0864c8-b45f-4324-a56f-ff583d488da0-openstack-config\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.542659 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctgv\" (UniqueName: 
\"kubernetes.io/projected/0d0864c8-b45f-4324-a56f-ff583d488da0-kube-api-access-rctgv\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.542743 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0864c8-b45f-4324-a56f-ff583d488da0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: E1212 00:46:57.614356 4606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 12 00:46:57 crc kubenswrapper[4606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_1bacd965-540d-4189-b4f8-592b4449cdac_0(ea7d16b94c085e342be02c4bcdcb23aa925bd2c164a7d2e6d633eb45275b6f60): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ea7d16b94c085e342be02c4bcdcb23aa925bd2c164a7d2e6d633eb45275b6f60" Netns:"/var/run/netns/172b0c27-d1e9-4d91-8232-e09beea15d93" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ea7d16b94c085e342be02c4bcdcb23aa925bd2c164a7d2e6d633eb45275b6f60;K8S_POD_UID=1bacd965-540d-4189-b4f8-592b4449cdac" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/1bacd965-540d-4189-b4f8-592b4449cdac]: expected pod UID "1bacd965-540d-4189-b4f8-592b4449cdac" but got "0d0864c8-b45f-4324-a56f-ff583d488da0" from Kube API Dec 12 00:46:57 crc kubenswrapper[4606]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 12 00:46:57 crc kubenswrapper[4606]: > Dec 12 00:46:57 crc kubenswrapper[4606]: E1212 00:46:57.614646 4606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 12 00:46:57 crc kubenswrapper[4606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_1bacd965-540d-4189-b4f8-592b4449cdac_0(ea7d16b94c085e342be02c4bcdcb23aa925bd2c164a7d2e6d633eb45275b6f60): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ea7d16b94c085e342be02c4bcdcb23aa925bd2c164a7d2e6d633eb45275b6f60" Netns:"/var/run/netns/172b0c27-d1e9-4d91-8232-e09beea15d93" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ea7d16b94c085e342be02c4bcdcb23aa925bd2c164a7d2e6d633eb45275b6f60;K8S_POD_UID=1bacd965-540d-4189-b4f8-592b4449cdac" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/1bacd965-540d-4189-b4f8-592b4449cdac]: expected pod UID "1bacd965-540d-4189-b4f8-592b4449cdac" but got "0d0864c8-b45f-4324-a56f-ff583d488da0" from Kube API Dec 12 00:46:57 crc kubenswrapper[4606]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 12 00:46:57 crc kubenswrapper[4606]: > pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.644251 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rctgv\" (UniqueName: \"kubernetes.io/projected/0d0864c8-b45f-4324-a56f-ff583d488da0-kube-api-access-rctgv\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.644311 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0864c8-b45f-4324-a56f-ff583d488da0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.644354 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d0864c8-b45f-4324-a56f-ff583d488da0-openstack-config-secret\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.644460 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d0864c8-b45f-4324-a56f-ff583d488da0-openstack-config\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.645230 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d0864c8-b45f-4324-a56f-ff583d488da0-openstack-config\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.653640 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0864c8-b45f-4324-a56f-ff583d488da0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.663835 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d0864c8-b45f-4324-a56f-ff583d488da0-openstack-config-secret\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.675367 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rctgv\" (UniqueName: \"kubernetes.io/projected/0d0864c8-b45f-4324-a56f-ff583d488da0-kube-api-access-rctgv\") pod \"openstackclient\" (UID: \"0d0864c8-b45f-4324-a56f-ff583d488da0\") " pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.714570 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ce9244-6f03-4386-8b05-fdca2f2e1120" path="/var/lib/kubelet/pods/06ce9244-6f03-4386-8b05-fdca2f2e1120/volumes" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.778683 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.825412 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.877889 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b868b57d5-xjh67" event={"ID":"723ab405-5905-44a6-a625-39fbc78948ef","Type":"ContainerStarted","Data":"18163c80cf9214b254eeffc62ceeeb020145ea5c1c645b095c3d4c6ed109f92c"} Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.878100 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b868b57d5-xjh67" event={"ID":"723ab405-5905-44a6-a625-39fbc78948ef","Type":"ContainerStarted","Data":"f7eee0b4086267a11d064513e9d7315ee5214c698ca17ef39f9f009e4ff3cf32"} Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.886766 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.974697 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 12 00:46:57 crc kubenswrapper[4606]: I1212 00:46:57.986644 4606 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1bacd965-540d-4189-b4f8-592b4449cdac" podUID="0d0864c8-b45f-4324-a56f-ff583d488da0" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.050512 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-combined-ca-bundle\") pod \"1bacd965-540d-4189-b4f8-592b4449cdac\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.050868 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config-secret\") pod \"1bacd965-540d-4189-b4f8-592b4449cdac\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.050890 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hnfn\" (UniqueName: \"kubernetes.io/projected/1bacd965-540d-4189-b4f8-592b4449cdac-kube-api-access-2hnfn\") pod \"1bacd965-540d-4189-b4f8-592b4449cdac\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.050978 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config\") pod \"1bacd965-540d-4189-b4f8-592b4449cdac\" (UID: \"1bacd965-540d-4189-b4f8-592b4449cdac\") " Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.052202 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1bacd965-540d-4189-b4f8-592b4449cdac" (UID: "1bacd965-540d-4189-b4f8-592b4449cdac"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.071511 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bacd965-540d-4189-b4f8-592b4449cdac" (UID: "1bacd965-540d-4189-b4f8-592b4449cdac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.073390 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1bacd965-540d-4189-b4f8-592b4449cdac" (UID: "1bacd965-540d-4189-b4f8-592b4449cdac"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.082359 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bacd965-540d-4189-b4f8-592b4449cdac-kube-api-access-2hnfn" (OuterVolumeSpecName: "kube-api-access-2hnfn") pod "1bacd965-540d-4189-b4f8-592b4449cdac" (UID: "1bacd965-540d-4189-b4f8-592b4449cdac"). InnerVolumeSpecName "kube-api-access-2hnfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.155347 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.155382 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.155391 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1bacd965-540d-4189-b4f8-592b4449cdac-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.155401 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hnfn\" (UniqueName: \"kubernetes.io/projected/1bacd965-540d-4189-b4f8-592b4449cdac-kube-api-access-2hnfn\") on node \"crc\" DevicePath \"\"" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.578553 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.903552 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b868b57d5-xjh67" event={"ID":"723ab405-5905-44a6-a625-39fbc78948ef","Type":"ContainerStarted","Data":"f49f6b8c3e2ea6319ad56f307eb92a0c49acf0778c415fa871d87d92bdcaa7b2"} Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.903877 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.908573 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"0d0864c8-b45f-4324-a56f-ff583d488da0","Type":"ContainerStarted","Data":"c866229e90e45f4b95e29adaf2ad4d48f6c3a737a1d24baf2200d17b63f4d015"} Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.910490 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.911355 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dedb8b2a-538c-4175-9ead-0d889ae2fd40","Type":"ContainerStarted","Data":"3ebdf359ad447399222c92120035771f838855925414f9ea21445f14bc94e4aa"} Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.911721 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dedb8b2a-538c-4175-9ead-0d889ae2fd40","Type":"ContainerStarted","Data":"80b98bf25d0d73046bd2ad37215b3175f52b8d4c6ed73afafc18e29c8f268b6c"} Dec 12 00:46:58 crc kubenswrapper[4606]: I1212 00:46:58.972541 4606 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1bacd965-540d-4189-b4f8-592b4449cdac" podUID="0d0864c8-b45f-4324-a56f-ff583d488da0" Dec 12 00:46:59 crc kubenswrapper[4606]: I1212 00:46:59.724467 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bacd965-540d-4189-b4f8-592b4449cdac" path="/var/lib/kubelet/pods/1bacd965-540d-4189-b4f8-592b4449cdac/volumes" Dec 12 00:46:59 crc kubenswrapper[4606]: I1212 00:46:59.729274 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7b868b57d5-xjh67" podStartSLOduration=3.729255186 podStartE2EDuration="3.729255186s" podCreationTimestamp="2025-12-12 00:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:58.935105815 +0000 UTC m=+1409.480458681" watchObservedRunningTime="2025-12-12 
00:46:59.729255186 +0000 UTC m=+1410.274608062" Dec 12 00:46:59 crc kubenswrapper[4606]: I1212 00:46:59.925541 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dedb8b2a-538c-4175-9ead-0d889ae2fd40","Type":"ContainerStarted","Data":"70afbebab73efb09c8820867e62ab45d47e37f78d4c02b2fc0e6d2e7c7dfc37b"} Dec 12 00:46:59 crc kubenswrapper[4606]: I1212 00:46:59.925606 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:46:59 crc kubenswrapper[4606]: I1212 00:46:59.956335 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.95631065 podStartE2EDuration="3.95631065s" podCreationTimestamp="2025-12-12 00:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:46:59.951474631 +0000 UTC m=+1410.496827497" watchObservedRunningTime="2025-12-12 00:46:59.95631065 +0000 UTC m=+1410.501663516" Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.085987 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.359490 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.359987 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-central-agent" containerID="cri-o://5ef316d75b266fd5423e7c5484db50d86dee9e7d2190bf4b280ed073bc688763" gracePeriod=30 Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.360647 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="proxy-httpd" 
containerID="cri-o://42f5bc69b4cc2aa4850677899d07272bbd6a3678c140953c96a2c041e4557759" gracePeriod=30 Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.360710 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="sg-core" containerID="cri-o://facdd7194e1066b996e364ccd0c06b074318090aeaaa48cc595ed41968e7f63f" gracePeriod=30 Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.360746 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-notification-agent" containerID="cri-o://a71d4498cd22cd092cf4563dd14745bff4bb0cd8de6d11b853522795379c5bed" gracePeriod=30 Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.385622 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.946388 4606 generic.go:334] "Generic (PLEG): container finished" podID="96587cc5-74f0-484c-8217-7d1a09d39580" containerID="facdd7194e1066b996e364ccd0c06b074318090aeaaa48cc595ed41968e7f63f" exitCode=2 Dec 12 00:47:01 crc kubenswrapper[4606]: I1212 00:47:01.946475 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerDied","Data":"facdd7194e1066b996e364ccd0c06b074318090aeaaa48cc595ed41968e7f63f"} Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.013021 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.013269 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.326742 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.964208 4606 generic.go:334] "Generic (PLEG): container finished" podID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerID="a50c810b61ab80031ddc96b3bb79c28f1af6ee88b45271a71187d7164a11dd04" exitCode=137 Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.964533 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerDied","Data":"a50c810b61ab80031ddc96b3bb79c28f1af6ee88b45271a71187d7164a11dd04"} Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.964558 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerStarted","Data":"118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9"} Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.980384 4606 generic.go:334] "Generic (PLEG): container finished" podID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerID="2dbf7369ad77ec21071d76169ebe84202398f88e4b0d626bed0634bc2c0923cd" exitCode=137 Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.980469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9fb498f6-62fcc" 
event={"ID":"e38df57e-1a86-4c45-bf40-6282a6a049ed","Type":"ContainerDied","Data":"2dbf7369ad77ec21071d76169ebe84202398f88e4b0d626bed0634bc2c0923cd"} Dec 12 00:47:02 crc kubenswrapper[4606]: I1212 00:47:02.980520 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9fb498f6-62fcc" event={"ID":"e38df57e-1a86-4c45-bf40-6282a6a049ed","Type":"ContainerStarted","Data":"45c639ce0fe5cb7959a1e8ff1646d4eb9c473c0bc5c88c824b78ae25e0ea3b06"} Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.005132 4606 generic.go:334] "Generic (PLEG): container finished" podID="96587cc5-74f0-484c-8217-7d1a09d39580" containerID="42f5bc69b4cc2aa4850677899d07272bbd6a3678c140953c96a2c041e4557759" exitCode=0 Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.005160 4606 generic.go:334] "Generic (PLEG): container finished" podID="96587cc5-74f0-484c-8217-7d1a09d39580" containerID="a71d4498cd22cd092cf4563dd14745bff4bb0cd8de6d11b853522795379c5bed" exitCode=0 Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.005167 4606 generic.go:334] "Generic (PLEG): container finished" podID="96587cc5-74f0-484c-8217-7d1a09d39580" containerID="5ef316d75b266fd5423e7c5484db50d86dee9e7d2190bf4b280ed073bc688763" exitCode=0 Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.005207 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerDied","Data":"42f5bc69b4cc2aa4850677899d07272bbd6a3678c140953c96a2c041e4557759"} Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.005248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerDied","Data":"a71d4498cd22cd092cf4563dd14745bff4bb0cd8de6d11b853522795379c5bed"} Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.005261 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerDied","Data":"5ef316d75b266fd5423e7c5484db50d86dee9e7d2190bf4b280ed073bc688763"} Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.071428 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.221689 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jwsl\" (UniqueName: \"kubernetes.io/projected/96587cc5-74f0-484c-8217-7d1a09d39580-kube-api-access-8jwsl\") pod \"96587cc5-74f0-484c-8217-7d1a09d39580\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.222834 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-run-httpd\") pod \"96587cc5-74f0-484c-8217-7d1a09d39580\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.223003 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-sg-core-conf-yaml\") pod \"96587cc5-74f0-484c-8217-7d1a09d39580\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.223145 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "96587cc5-74f0-484c-8217-7d1a09d39580" (UID: "96587cc5-74f0-484c-8217-7d1a09d39580"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.223198 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-log-httpd\") pod \"96587cc5-74f0-484c-8217-7d1a09d39580\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.223321 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-scripts\") pod \"96587cc5-74f0-484c-8217-7d1a09d39580\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.223383 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-combined-ca-bundle\") pod \"96587cc5-74f0-484c-8217-7d1a09d39580\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.223439 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-config-data\") pod \"96587cc5-74f0-484c-8217-7d1a09d39580\" (UID: \"96587cc5-74f0-484c-8217-7d1a09d39580\") " Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.224018 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "96587cc5-74f0-484c-8217-7d1a09d39580" (UID: "96587cc5-74f0-484c-8217-7d1a09d39580"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.224249 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.224333 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/96587cc5-74f0-484c-8217-7d1a09d39580-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.232321 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-scripts" (OuterVolumeSpecName: "scripts") pod "96587cc5-74f0-484c-8217-7d1a09d39580" (UID: "96587cc5-74f0-484c-8217-7d1a09d39580"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.242023 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96587cc5-74f0-484c-8217-7d1a09d39580-kube-api-access-8jwsl" (OuterVolumeSpecName: "kube-api-access-8jwsl") pod "96587cc5-74f0-484c-8217-7d1a09d39580" (UID: "96587cc5-74f0-484c-8217-7d1a09d39580"). InnerVolumeSpecName "kube-api-access-8jwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.267997 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "96587cc5-74f0-484c-8217-7d1a09d39580" (UID: "96587cc5-74f0-484c-8217-7d1a09d39580"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.316490 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96587cc5-74f0-484c-8217-7d1a09d39580" (UID: "96587cc5-74f0-484c-8217-7d1a09d39580"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.328419 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.328453 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jwsl\" (UniqueName: \"kubernetes.io/projected/96587cc5-74f0-484c-8217-7d1a09d39580-kube-api-access-8jwsl\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.328466 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.328477 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.383640 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-config-data" (OuterVolumeSpecName: "config-data") pod "96587cc5-74f0-484c-8217-7d1a09d39580" (UID: "96587cc5-74f0-484c-8217-7d1a09d39580"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:03 crc kubenswrapper[4606]: I1212 00:47:03.430414 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96587cc5-74f0-484c-8217-7d1a09d39580-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.018990 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"96587cc5-74f0-484c-8217-7d1a09d39580","Type":"ContainerDied","Data":"fd93240d55a848dddf7cac622fc2ad62a501602170326b396b860eb1c8fe9f80"} Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.019314 4606 scope.go:117] "RemoveContainer" containerID="42f5bc69b4cc2aa4850677899d07272bbd6a3678c140953c96a2c041e4557759" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.019081 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.052220 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.055918 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.083628 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:04 crc kubenswrapper[4606]: E1212 00:47:04.083989 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-central-agent" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084007 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-central-agent" Dec 12 00:47:04 crc kubenswrapper[4606]: E1212 00:47:04.084015 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" 
containerName="sg-core" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084021 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="sg-core" Dec 12 00:47:04 crc kubenswrapper[4606]: E1212 00:47:04.084037 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="proxy-httpd" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084048 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="proxy-httpd" Dec 12 00:47:04 crc kubenswrapper[4606]: E1212 00:47:04.084125 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-notification-agent" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084131 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-notification-agent" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084339 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-central-agent" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084362 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="sg-core" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084374 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="ceilometer-notification-agent" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.084383 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" containerName="proxy-httpd" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.085825 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.086599 4606 scope.go:117] "RemoveContainer" containerID="facdd7194e1066b996e364ccd0c06b074318090aeaaa48cc595ed41968e7f63f" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.093667 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.094093 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.103483 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.141314 4606 scope.go:117] "RemoveContainer" containerID="a71d4498cd22cd092cf4563dd14745bff4bb0cd8de6d11b853522795379c5bed" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.145244 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.145279 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-config-data\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.145319 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-scripts\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 
12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.145408 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw2tk\" (UniqueName: \"kubernetes.io/projected/ecbf2172-2168-4e63-9523-38683ef6eb49-kube-api-access-gw2tk\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.145429 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-log-httpd\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.145446 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-run-httpd\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.145468 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.182952 4606 scope.go:117] "RemoveContainer" containerID="5ef316d75b266fd5423e7c5484db50d86dee9e7d2190bf4b280ed073bc688763" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.247226 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw2tk\" (UniqueName: \"kubernetes.io/projected/ecbf2172-2168-4e63-9523-38683ef6eb49-kube-api-access-gw2tk\") pod \"ceilometer-0\" (UID: 
\"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.247278 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-log-httpd\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.247297 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-run-httpd\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.247320 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.247358 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.247393 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-config-data\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.247428 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-scripts\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.248029 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-log-httpd\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.248047 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-run-httpd\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.253108 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.254822 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-scripts\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.255817 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.259128 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-config-data\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.275894 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw2tk\" (UniqueName: \"kubernetes.io/projected/ecbf2172-2168-4e63-9523-38683ef6eb49-kube-api-access-gw2tk\") pod \"ceilometer-0\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.407865 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:04 crc kubenswrapper[4606]: I1212 00:47:04.935519 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:05 crc kubenswrapper[4606]: I1212 00:47:05.041808 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerStarted","Data":"86cf2bf89f54183596dda74afde766f0c31e6e77d8abcfe864aa8104318458b0"} Dec 12 00:47:05 crc kubenswrapper[4606]: I1212 00:47:05.712073 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96587cc5-74f0-484c-8217-7d1a09d39580" path="/var/lib/kubelet/pods/96587cc5-74f0-484c-8217-7d1a09d39580/volumes" Dec 12 00:47:06 crc kubenswrapper[4606]: I1212 00:47:06.050653 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerStarted","Data":"ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3"} Dec 12 00:47:06 crc kubenswrapper[4606]: I1212 00:47:06.581548 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:47:06 crc kubenswrapper[4606]: I1212 
00:47:06.582264 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b868b57d5-xjh67" Dec 12 00:47:07 crc kubenswrapper[4606]: I1212 00:47:07.697585 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 12 00:47:08 crc kubenswrapper[4606]: I1212 00:47:08.092232 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerStarted","Data":"5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568"} Dec 12 00:47:09 crc kubenswrapper[4606]: I1212 00:47:09.122920 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:11 crc kubenswrapper[4606]: I1212 00:47:11.904878 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:47:11 crc kubenswrapper[4606]: I1212 00:47:11.905095 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:47:12 crc kubenswrapper[4606]: I1212 00:47:12.178380 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:47:12 crc kubenswrapper[4606]: I1212 00:47:12.178439 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:47:12 crc kubenswrapper[4606]: I1212 00:47:12.179735 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:47:16 crc kubenswrapper[4606]: E1212 00:47:16.799872 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled 
desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Dec 12 00:47:16 crc kubenswrapper[4606]: E1212 00:47:16.800567 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9dhdh577h68bhb5h569h6bhf8hfdh5bbh5b9h55fh587h557h5d8h669h77h54dh66bh65bh554h5c9h59bh5d8h57hfch675h56h66fhb6hcbh576q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rctgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(0d0864c8-b45f-4324-a56f-ff583d488da0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 00:47:16 crc kubenswrapper[4606]: E1212 00:47:16.801789 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="0d0864c8-b45f-4324-a56f-ff583d488da0" Dec 12 00:47:17 crc kubenswrapper[4606]: I1212 00:47:17.188462 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerStarted","Data":"ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd"} Dec 12 00:47:17 crc kubenswrapper[4606]: E1212 00:47:17.189808 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="0d0864c8-b45f-4324-a56f-ff583d488da0" Dec 12 00:47:17 crc kubenswrapper[4606]: I1212 00:47:17.404298 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:47:17 crc kubenswrapper[4606]: I1212 00:47:17.404744 4606 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-httpd" containerID="cri-o://ea4874522323d37556a3ca228c6e8c9fd26d6a3c5e91668d929069262ffb5441" gracePeriod=30 Dec 12 00:47:17 crc kubenswrapper[4606]: I1212 00:47:17.404953 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-log" containerID="cri-o://6f8ed27d7178a7d8152db04f45d45142e85ee1cea9642100ea6666b984537706" gracePeriod=30 Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.200702 4606 generic.go:334] "Generic (PLEG): container finished" podID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerID="6f8ed27d7178a7d8152db04f45d45142e85ee1cea9642100ea6666b984537706" exitCode=143 Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.200738 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3dc8210-b17e-45f0-8501-2a545cd4d020","Type":"ContainerDied","Data":"6f8ed27d7178a7d8152db04f45d45142e85ee1cea9642100ea6666b984537706"} Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.613330 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5k8jp"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.614579 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.629821 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5k8jp"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.644935 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920936eb-f659-4feb-b571-e90906e8bee2-operator-scripts\") pod \"nova-api-db-create-5k8jp\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.645025 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p94rb\" (UniqueName: \"kubernetes.io/projected/920936eb-f659-4feb-b571-e90906e8bee2-kube-api-access-p94rb\") pod \"nova-api-db-create-5k8jp\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.700942 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zhvg8"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.702004 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.714136 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zhvg8"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.747002 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-operator-scripts\") pod \"nova-cell0-db-create-zhvg8\" (UID: \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.747164 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920936eb-f659-4feb-b571-e90906e8bee2-operator-scripts\") pod \"nova-api-db-create-5k8jp\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.747240 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjpv8\" (UniqueName: \"kubernetes.io/projected/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-kube-api-access-fjpv8\") pod \"nova-cell0-db-create-zhvg8\" (UID: \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.747855 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p94rb\" (UniqueName: \"kubernetes.io/projected/920936eb-f659-4feb-b571-e90906e8bee2-kube-api-access-p94rb\") pod \"nova-api-db-create-5k8jp\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.749570 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/920936eb-f659-4feb-b571-e90906e8bee2-operator-scripts\") pod \"nova-api-db-create-5k8jp\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.771776 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p94rb\" (UniqueName: \"kubernetes.io/projected/920936eb-f659-4feb-b571-e90906e8bee2-kube-api-access-p94rb\") pod \"nova-api-db-create-5k8jp\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.818054 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lh8vs"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.819116 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.848526 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7fad-account-create-update-bgwst"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.849696 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.849696 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-operator-scripts\") pod \"nova-cell1-db-create-lh8vs\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.849801 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x7v6\" (UniqueName: \"kubernetes.io/projected/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-kube-api-access-7x7v6\") pod \"nova-cell1-db-create-lh8vs\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.849844 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjpv8\" (UniqueName: \"kubernetes.io/projected/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-kube-api-access-fjpv8\") pod \"nova-cell0-db-create-zhvg8\" (UID: \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.849909 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-operator-scripts\") pod \"nova-cell0-db-create-zhvg8\" (UID: \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.850609 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-operator-scripts\") pod \"nova-cell0-db-create-zhvg8\" (UID: 
\"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.852128 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.863511 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lh8vs"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.874493 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7fad-account-create-update-bgwst"] Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.892883 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjpv8\" (UniqueName: \"kubernetes.io/projected/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-kube-api-access-fjpv8\") pod \"nova-cell0-db-create-zhvg8\" (UID: \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.944085 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.955944 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-operator-scripts\") pod \"nova-cell1-db-create-lh8vs\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.956000 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-operator-scripts\") pod \"nova-api-7fad-account-create-update-bgwst\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.956040 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpfsj\" (UniqueName: \"kubernetes.io/projected/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-kube-api-access-cpfsj\") pod \"nova-api-7fad-account-create-update-bgwst\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.956093 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x7v6\" (UniqueName: \"kubernetes.io/projected/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-kube-api-access-7x7v6\") pod \"nova-cell1-db-create-lh8vs\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:18 crc kubenswrapper[4606]: I1212 00:47:18.957046 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-operator-scripts\") pod 
\"nova-cell1-db-create-lh8vs\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.003455 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x7v6\" (UniqueName: \"kubernetes.io/projected/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-kube-api-access-7x7v6\") pod \"nova-cell1-db-create-lh8vs\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.028703 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.056021 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-13d9-account-create-update-glqls"] Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.057291 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.058820 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-operator-scripts\") pod \"nova-api-7fad-account-create-update-bgwst\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.058883 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpfsj\" (UniqueName: \"kubernetes.io/projected/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-kube-api-access-cpfsj\") pod \"nova-api-7fad-account-create-update-bgwst\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.059717 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-operator-scripts\") pod \"nova-api-7fad-account-create-update-bgwst\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.060124 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.102107 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpfsj\" (UniqueName: \"kubernetes.io/projected/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-kube-api-access-cpfsj\") pod \"nova-api-7fad-account-create-update-bgwst\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.119222 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-13d9-account-create-update-glqls"] Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.134624 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.161273 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3208805-45c9-44bd-b7b3-622cdbc2dae9-operator-scripts\") pod \"nova-cell0-13d9-account-create-update-glqls\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.161341 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rb77\" (UniqueName: \"kubernetes.io/projected/c3208805-45c9-44bd-b7b3-622cdbc2dae9-kube-api-access-6rb77\") pod \"nova-cell0-13d9-account-create-update-glqls\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.173478 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.263082 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3208805-45c9-44bd-b7b3-622cdbc2dae9-operator-scripts\") pod \"nova-cell0-13d9-account-create-update-glqls\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.263201 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rb77\" (UniqueName: \"kubernetes.io/projected/c3208805-45c9-44bd-b7b3-622cdbc2dae9-kube-api-access-6rb77\") pod \"nova-cell0-13d9-account-create-update-glqls\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.264105 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3208805-45c9-44bd-b7b3-622cdbc2dae9-operator-scripts\") pod \"nova-cell0-13d9-account-create-update-glqls\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.284263 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6bb0-account-create-update-p7kwg"] Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.285397 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.286235 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerStarted","Data":"a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70"} Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.286355 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-central-agent" containerID="cri-o://ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3" gracePeriod=30 Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.286454 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.286490 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="proxy-httpd" containerID="cri-o://a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70" gracePeriod=30 Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.286525 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="sg-core" containerID="cri-o://ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd" gracePeriod=30 Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.286558 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-notification-agent" containerID="cri-o://5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568" gracePeriod=30 Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 
00:47:19.294379 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.303327 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6bb0-account-create-update-p7kwg"] Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.360791 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rb77\" (UniqueName: \"kubernetes.io/projected/c3208805-45c9-44bd-b7b3-622cdbc2dae9-kube-api-access-6rb77\") pod \"nova-cell0-13d9-account-create-update-glqls\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.368134 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/509a7acf-27c5-45b9-8018-2b21b84b9b0a-operator-scripts\") pod \"nova-cell1-6bb0-account-create-update-p7kwg\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.368206 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdg66\" (UniqueName: \"kubernetes.io/projected/509a7acf-27c5-45b9-8018-2b21b84b9b0a-kube-api-access-fdg66\") pod \"nova-cell1-6bb0-account-create-update-p7kwg\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.372884 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.139883767 podStartE2EDuration="15.372859252s" podCreationTimestamp="2025-12-12 00:47:04 +0000 UTC" firstStartedPulling="2025-12-12 00:47:04.970100679 +0000 UTC m=+1415.515453545" 
lastFinishedPulling="2025-12-12 00:47:18.203076164 +0000 UTC m=+1428.748429030" observedRunningTime="2025-12-12 00:47:19.366721437 +0000 UTC m=+1429.912074303" watchObservedRunningTime="2025-12-12 00:47:19.372859252 +0000 UTC m=+1429.918212118" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.382305 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.469957 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/509a7acf-27c5-45b9-8018-2b21b84b9b0a-operator-scripts\") pod \"nova-cell1-6bb0-account-create-update-p7kwg\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.470016 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdg66\" (UniqueName: \"kubernetes.io/projected/509a7acf-27c5-45b9-8018-2b21b84b9b0a-kube-api-access-fdg66\") pod \"nova-cell1-6bb0-account-create-update-p7kwg\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.471113 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/509a7acf-27c5-45b9-8018-2b21b84b9b0a-operator-scripts\") pod \"nova-cell1-6bb0-account-create-update-p7kwg\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.535823 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdg66\" (UniqueName: \"kubernetes.io/projected/509a7acf-27c5-45b9-8018-2b21b84b9b0a-kube-api-access-fdg66\") pod 
\"nova-cell1-6bb0-account-create-update-p7kwg\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:19 crc kubenswrapper[4606]: I1212 00:47:19.584987 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.088031 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5k8jp"] Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.114683 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zhvg8"] Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.216546 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lh8vs"] Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.270627 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7fad-account-create-update-bgwst"] Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.301012 4606 generic.go:334] "Generic (PLEG): container finished" podID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerID="a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70" exitCode=0 Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.301041 4606 generic.go:334] "Generic (PLEG): container finished" podID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerID="ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd" exitCode=2 Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.301050 4606 generic.go:334] "Generic (PLEG): container finished" podID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerID="ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3" exitCode=0 Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.301084 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerDied","Data":"a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70"} Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.301110 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerDied","Data":"ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd"} Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.301121 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerDied","Data":"ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3"} Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.306385 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zhvg8" event={"ID":"2bbd1699-c391-4c27-9a8b-9dadfc9d5530","Type":"ContainerStarted","Data":"d0f0505c21f4eb9eeba32f5364c3426638241d3d8b8ab81ce587882f768208c8"} Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.320102 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5k8jp" event={"ID":"920936eb-f659-4feb-b571-e90906e8bee2","Type":"ContainerStarted","Data":"7b1f810ebad14e2477ba861506f00a5acaf2bd68c8015dadc1b96dcda94da917"} Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.329230 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lh8vs" event={"ID":"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda","Type":"ContainerStarted","Data":"5d8c3857532e877392c9d46a3d22441d63c9ff008846a4c7a21098792c898aa4"} Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.364360 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6bb0-account-create-update-p7kwg"] Dec 12 00:47:20 crc kubenswrapper[4606]: I1212 00:47:20.381812 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-13d9-account-create-update-glqls"] Dec 12 00:47:20 crc kubenswrapper[4606]: W1212 00:47:20.384430 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod509a7acf_27c5_45b9_8018_2b21b84b9b0a.slice/crio-1f7547771307df898eff7eba37c34c9cc7196d6f110c52c9f7e779bc31abff65 WatchSource:0}: Error finding container 1f7547771307df898eff7eba37c34c9cc7196d6f110c52c9f7e779bc31abff65: Status 404 returned error can't find the container with id 1f7547771307df898eff7eba37c34c9cc7196d6f110c52c9f7e779bc31abff65 Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.157375 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.219507 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-log-httpd\") pod \"ecbf2172-2168-4e63-9523-38683ef6eb49\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.219556 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-config-data\") pod \"ecbf2172-2168-4e63-9523-38683ef6eb49\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.219584 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw2tk\" (UniqueName: \"kubernetes.io/projected/ecbf2172-2168-4e63-9523-38683ef6eb49-kube-api-access-gw2tk\") pod \"ecbf2172-2168-4e63-9523-38683ef6eb49\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.219642 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-run-httpd\") pod \"ecbf2172-2168-4e63-9523-38683ef6eb49\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.219837 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-sg-core-conf-yaml\") pod \"ecbf2172-2168-4e63-9523-38683ef6eb49\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.219876 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-combined-ca-bundle\") pod \"ecbf2172-2168-4e63-9523-38683ef6eb49\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.221731 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ecbf2172-2168-4e63-9523-38683ef6eb49" (UID: "ecbf2172-2168-4e63-9523-38683ef6eb49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.238638 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ecbf2172-2168-4e63-9523-38683ef6eb49" (UID: "ecbf2172-2168-4e63-9523-38683ef6eb49"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.246590 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbf2172-2168-4e63-9523-38683ef6eb49-kube-api-access-gw2tk" (OuterVolumeSpecName: "kube-api-access-gw2tk") pod "ecbf2172-2168-4e63-9523-38683ef6eb49" (UID: "ecbf2172-2168-4e63-9523-38683ef6eb49"). InnerVolumeSpecName "kube-api-access-gw2tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.323720 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-scripts\") pod \"ecbf2172-2168-4e63-9523-38683ef6eb49\" (UID: \"ecbf2172-2168-4e63-9523-38683ef6eb49\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.324166 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.324191 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecbf2172-2168-4e63-9523-38683ef6eb49-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.324200 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw2tk\" (UniqueName: \"kubernetes.io/projected/ecbf2172-2168-4e63-9523-38683ef6eb49-kube-api-access-gw2tk\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.337406 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ecbf2172-2168-4e63-9523-38683ef6eb49" (UID: 
"ecbf2172-2168-4e63-9523-38683ef6eb49"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.349239 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-scripts" (OuterVolumeSpecName: "scripts") pod "ecbf2172-2168-4e63-9523-38683ef6eb49" (UID: "ecbf2172-2168-4e63-9523-38683ef6eb49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.396411 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zhvg8" event={"ID":"2bbd1699-c391-4c27-9a8b-9dadfc9d5530","Type":"ContainerStarted","Data":"213d60894d2453712748ea73c425102921cc0b24b95201f323d5c4ba48526224"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.426286 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.426318 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.433409 4606 generic.go:334] "Generic (PLEG): container finished" podID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerID="ea4874522323d37556a3ca228c6e8c9fd26d6a3c5e91668d929069262ffb5441" exitCode=0 Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.433488 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3dc8210-b17e-45f0-8501-2a545cd4d020","Type":"ContainerDied","Data":"ea4874522323d37556a3ca228c6e8c9fd26d6a3c5e91668d929069262ffb5441"} Dec 12 00:47:21 crc kubenswrapper[4606]: 
I1212 00:47:21.439365 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecbf2172-2168-4e63-9523-38683ef6eb49" (UID: "ecbf2172-2168-4e63-9523-38683ef6eb49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.442402 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" event={"ID":"509a7acf-27c5-45b9-8018-2b21b84b9b0a","Type":"ContainerStarted","Data":"06e78b4586f69ffcd4a6697e1b14dae2d3096a03ec94847beb9405c7f122232a"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.442440 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" event={"ID":"509a7acf-27c5-45b9-8018-2b21b84b9b0a","Type":"ContainerStarted","Data":"1f7547771307df898eff7eba37c34c9cc7196d6f110c52c9f7e779bc31abff65"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.503363 4606 generic.go:334] "Generic (PLEG): container finished" podID="920936eb-f659-4feb-b571-e90906e8bee2" containerID="0037d02b8812391ec58697f3534f566bd94955c3d84ef3b0cf1e6e57cbb7e6f9" exitCode=0 Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.503433 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5k8jp" event={"ID":"920936eb-f659-4feb-b571-e90906e8bee2","Type":"ContainerDied","Data":"0037d02b8812391ec58697f3534f566bd94955c3d84ef3b0cf1e6e57cbb7e6f9"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.522312 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-13d9-account-create-update-glqls" event={"ID":"c3208805-45c9-44bd-b7b3-622cdbc2dae9","Type":"ContainerStarted","Data":"38af5d998135ba12b6d1e8e5c2970e40b0c33bb52defd76f200d68829570d9ad"} Dec 12 00:47:21 crc 
kubenswrapper[4606]: I1212 00:47:21.522353 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-13d9-account-create-update-glqls" event={"ID":"c3208805-45c9-44bd-b7b3-622cdbc2dae9","Type":"ContainerStarted","Data":"9a8018148efc3a1351027878edc96c727797f229798e65e1c3a76ae6f2eeb991"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.511550 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" podStartSLOduration=2.511527143 podStartE2EDuration="2.511527143s" podCreationTimestamp="2025-12-12 00:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:47:21.478317623 +0000 UTC m=+1432.023670489" watchObservedRunningTime="2025-12-12 00:47:21.511527143 +0000 UTC m=+1432.056880009" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.573567 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.589660 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lh8vs" event={"ID":"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda","Type":"ContainerStarted","Data":"4cf8b7c0ae6c560fa6186ad7009387bc04e6f2e401a474840065e803126a8b84"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.594075 4606 generic.go:334] "Generic (PLEG): container finished" podID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerID="5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568" exitCode=0 Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.594150 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerDied","Data":"5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.594202 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecbf2172-2168-4e63-9523-38683ef6eb49","Type":"ContainerDied","Data":"86cf2bf89f54183596dda74afde766f0c31e6e77d8abcfe864aa8104318458b0"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.594220 4606 scope.go:117] "RemoveContainer" containerID="a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.594607 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.597434 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-config-data" (OuterVolumeSpecName: "config-data") pod "ecbf2172-2168-4e63-9523-38683ef6eb49" (UID: "ecbf2172-2168-4e63-9523-38683ef6eb49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.602931 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7fad-account-create-update-bgwst" event={"ID":"3483c50d-cf68-45ab-b01b-7fe2e6f1c057","Type":"ContainerStarted","Data":"a8677ce0e335635f3ad4f08931e968cfaa89743560ba6a0c94f0e8aa068b2550"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.603043 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7fad-account-create-update-bgwst" event={"ID":"3483c50d-cf68-45ab-b01b-7fe2e6f1c057","Type":"ContainerStarted","Data":"f0e3bd0e5509d46adc65ddce5f2c30b11947697efdde6c50bc129ea84af9295d"} Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.610768 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-13d9-account-create-update-glqls" podStartSLOduration=2.6107501319999997 podStartE2EDuration="2.610750132s" podCreationTimestamp="2025-12-12 00:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:47:21.587510049 +0000 UTC m=+1432.132862925" watchObservedRunningTime="2025-12-12 00:47:21.610750132 +0000 UTC m=+1432.156102998" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.675314 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecbf2172-2168-4e63-9523-38683ef6eb49-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.748082 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.759562 4606 scope.go:117] "RemoveContainer" containerID="ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.776748 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-httpd-run\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.776826 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6kqb\" (UniqueName: \"kubernetes.io/projected/e3dc8210-b17e-45f0-8501-2a545cd4d020-kube-api-access-l6kqb\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.776917 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.776950 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-scripts\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.776995 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-config-data\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc 
kubenswrapper[4606]: I1212 00:47:21.777024 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-logs\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.777059 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-internal-tls-certs\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.777112 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-combined-ca-bundle\") pod \"e3dc8210-b17e-45f0-8501-2a545cd4d020\" (UID: \"e3dc8210-b17e-45f0-8501-2a545cd4d020\") " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.777349 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.777636 4606 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.784111 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-logs" (OuterVolumeSpecName: "logs") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.788619 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3dc8210-b17e-45f0-8501-2a545cd4d020-kube-api-access-l6kqb" (OuterVolumeSpecName: "kube-api-access-l6kqb") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "kube-api-access-l6kqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.788727 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-scripts" (OuterVolumeSpecName: "scripts") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.791538 4606 scope.go:117] "RemoveContainer" containerID="5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.793479 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.815931 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.827802 4606 scope.go:117] "RemoveContainer" containerID="ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.863586 4606 scope.go:117] "RemoveContainer" containerID="a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.864080 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70\": container with ID starting with a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70 not found: ID does not exist" containerID="a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.864116 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70"} err="failed to get container status \"a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70\": rpc error: code = NotFound desc = could not find container \"a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70\": container with ID starting with a96ae92641bb508ce08f144e25adddc702a59a36991edae30ebf58eb6b203e70 not found: ID does not exist" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.864142 4606 scope.go:117] "RemoveContainer" containerID="ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.864572 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd\": container with ID starting with 
ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd not found: ID does not exist" containerID="ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.864608 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd"} err="failed to get container status \"ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd\": rpc error: code = NotFound desc = could not find container \"ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd\": container with ID starting with ecaeb92c5c6a60a43d48996e299eafc84a23df3aaa72de29f47c555748d054cd not found: ID does not exist" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.864625 4606 scope.go:117] "RemoveContainer" containerID="5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.864849 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568\": container with ID starting with 5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568 not found: ID does not exist" containerID="5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.864874 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568"} err="failed to get container status \"5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568\": rpc error: code = NotFound desc = could not find container \"5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568\": container with ID starting with 5e0eaee84c7ab6407f25b69401b94702d3e502491630fe200f1bb88bcb234568 not found: ID does not 
exist" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.864890 4606 scope.go:117] "RemoveContainer" containerID="ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.865803 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3\": container with ID starting with ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3 not found: ID does not exist" containerID="ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.865849 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3"} err="failed to get container status \"ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3\": rpc error: code = NotFound desc = could not find container \"ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3\": container with ID starting with ffac6eed7556dcdd49e518d306caae0f6b3f832fa7c08cab62b5616f706bb4e3 not found: ID does not exist" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.872468 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.879123 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.879153 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.879162 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dc8210-b17e-45f0-8501-2a545cd4d020-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.879184 4606 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.879195 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.879203 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6kqb\" (UniqueName: \"kubernetes.io/projected/e3dc8210-b17e-45f0-8501-2a545cd4d020-kube-api-access-l6kqb\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.905898 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.906091 4606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.919495 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-config-data" (OuterVolumeSpecName: "config-data") pod "e3dc8210-b17e-45f0-8501-2a545cd4d020" (UID: "e3dc8210-b17e-45f0-8501-2a545cd4d020"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.940415 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.949281 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.961393 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.961818 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="sg-core" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.961835 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="sg-core" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.961855 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-httpd" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.961863 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-httpd" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.961883 4606 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-log" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.961888 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-log" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.961903 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-notification-agent" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.961909 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-notification-agent" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.961926 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="proxy-httpd" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.961931 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="proxy-httpd" Dec 12 00:47:21 crc kubenswrapper[4606]: E1212 00:47:21.961952 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-central-agent" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.961958 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-central-agent" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.962137 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-log" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.962156 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-central-agent" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 
00:47:21.962183 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" containerName="glance-httpd" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.962195 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="ceilometer-notification-agent" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.962212 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="proxy-httpd" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.962219 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" containerName="sg-core" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.963959 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.967513 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.968045 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:47:21 crc kubenswrapper[4606]: I1212 00:47:21.972445 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.982710 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.982754 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9bb7\" (UniqueName: 
\"kubernetes.io/projected/e8225349-5e4b-4de7-8a94-abb2feb23dc5-kube-api-access-h9bb7\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.982809 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-run-httpd\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.982863 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-scripts\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.982933 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.982956 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-log-httpd\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.982978 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-config-data\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " 
pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.983033 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:21.983044 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dc8210-b17e-45f0-8501-2a545cd4d020-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.196550 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.196618 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-log-httpd\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.196663 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-config-data\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.196723 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 
00:47:22.196755 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9bb7\" (UniqueName: \"kubernetes.io/projected/e8225349-5e4b-4de7-8a94-abb2feb23dc5-kube-api-access-h9bb7\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.196829 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-run-httpd\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.196861 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-scripts\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.206106 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.217890 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-run-httpd\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.231156 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-log-httpd\") pod \"ceilometer-0\" 
(UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.243259 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-config-data\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.243833 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.286375 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9bb7\" (UniqueName: \"kubernetes.io/projected/e8225349-5e4b-4de7-8a94-abb2feb23dc5-kube-api-access-h9bb7\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.286570 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-scripts\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.315939 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.514445 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.642357 4606 generic.go:334] "Generic (PLEG): container finished" podID="3483c50d-cf68-45ab-b01b-7fe2e6f1c057" containerID="a8677ce0e335635f3ad4f08931e968cfaa89743560ba6a0c94f0e8aa068b2550" exitCode=0 Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.642484 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7fad-account-create-update-bgwst" event={"ID":"3483c50d-cf68-45ab-b01b-7fe2e6f1c057","Type":"ContainerDied","Data":"a8677ce0e335635f3ad4f08931e968cfaa89743560ba6a0c94f0e8aa068b2550"} Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.666007 4606 generic.go:334] "Generic (PLEG): container finished" podID="2bbd1699-c391-4c27-9a8b-9dadfc9d5530" containerID="213d60894d2453712748ea73c425102921cc0b24b95201f323d5c4ba48526224" exitCode=0 Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.666084 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zhvg8" event={"ID":"2bbd1699-c391-4c27-9a8b-9dadfc9d5530","Type":"ContainerDied","Data":"213d60894d2453712748ea73c425102921cc0b24b95201f323d5c4ba48526224"} Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.683322 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3dc8210-b17e-45f0-8501-2a545cd4d020","Type":"ContainerDied","Data":"7a0046cca686cedb5519fc8ec4d0df3d5a83972639ef799f01de8e8d00b4c7b9"} Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.683368 4606 scope.go:117] "RemoveContainer" containerID="ea4874522323d37556a3ca228c6e8c9fd26d6a3c5e91668d929069262ffb5441" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.683511 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.698390 4606 generic.go:334] "Generic (PLEG): container finished" podID="509a7acf-27c5-45b9-8018-2b21b84b9b0a" containerID="06e78b4586f69ffcd4a6697e1b14dae2d3096a03ec94847beb9405c7f122232a" exitCode=0 Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.698445 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" event={"ID":"509a7acf-27c5-45b9-8018-2b21b84b9b0a","Type":"ContainerDied","Data":"06e78b4586f69ffcd4a6697e1b14dae2d3096a03ec94847beb9405c7f122232a"} Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.710680 4606 generic.go:334] "Generic (PLEG): container finished" podID="c3208805-45c9-44bd-b7b3-622cdbc2dae9" containerID="38af5d998135ba12b6d1e8e5c2970e40b0c33bb52defd76f200d68829570d9ad" exitCode=0 Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.710792 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-13d9-account-create-update-glqls" event={"ID":"c3208805-45c9-44bd-b7b3-622cdbc2dae9","Type":"ContainerDied","Data":"38af5d998135ba12b6d1e8e5c2970e40b0c33bb52defd76f200d68829570d9ad"} Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.728385 4606 generic.go:334] "Generic (PLEG): container finished" podID="f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda" containerID="4cf8b7c0ae6c560fa6186ad7009387bc04e6f2e401a474840065e803126a8b84" exitCode=0 Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.728645 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lh8vs" event={"ID":"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda","Type":"ContainerDied","Data":"4cf8b7c0ae6c560fa6186ad7009387bc04e6f2e401a474840065e803126a8b84"} Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.780326 4606 scope.go:117] "RemoveContainer" containerID="6f8ed27d7178a7d8152db04f45d45142e85ee1cea9642100ea6666b984537706" Dec 12 00:47:22 
crc kubenswrapper[4606]: I1212 00:47:22.786330 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.795281 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.869776 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.882995 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.915702 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.915954 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 12 00:47:22 crc kubenswrapper[4606]: I1212 00:47:22.919282 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.049679 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa0e877c-3d78-482d-8bb0-003663d82e4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.049773 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42b25\" (UniqueName: \"kubernetes.io/projected/aa0e877c-3d78-482d-8bb0-003663d82e4a-kube-api-access-42b25\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " 
pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.049810 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.049861 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.049887 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.049964 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.050053 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0e877c-3d78-482d-8bb0-003663d82e4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " 
pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.050077 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155306 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0e877c-3d78-482d-8bb0-003663d82e4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155354 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155386 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa0e877c-3d78-482d-8bb0-003663d82e4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155413 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42b25\" (UniqueName: \"kubernetes.io/projected/aa0e877c-3d78-482d-8bb0-003663d82e4a-kube-api-access-42b25\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " 
pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155430 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155462 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155478 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.155532 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.157934 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0e877c-3d78-482d-8bb0-003663d82e4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.166521 4606 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.168711 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa0e877c-3d78-482d-8bb0-003663d82e4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.178944 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.192551 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.202155 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.219723 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/aa0e877c-3d78-482d-8bb0-003663d82e4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.243901 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42b25\" (UniqueName: \"kubernetes.io/projected/aa0e877c-3d78-482d-8bb0-003663d82e4a-kube-api-access-42b25\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.332415 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa0e877c-3d78-482d-8bb0-003663d82e4a\") " pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.555770 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.756367 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3dc8210-b17e-45f0-8501-2a545cd4d020" path="/var/lib/kubelet/pods/e3dc8210-b17e-45f0-8501-2a545cd4d020/volumes" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.756964 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecbf2172-2168-4e63-9523-38683ef6eb49" path="/var/lib/kubelet/pods/ecbf2172-2168-4e63-9523-38683ef6eb49/volumes" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.768306 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.849805 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7fad-account-create-update-bgwst" event={"ID":"3483c50d-cf68-45ab-b01b-7fe2e6f1c057","Type":"ContainerDied","Data":"f0e3bd0e5509d46adc65ddce5f2c30b11947697efdde6c50bc129ea84af9295d"} Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.849842 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0e3bd0e5509d46adc65ddce5f2c30b11947697efdde6c50bc129ea84af9295d" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.849892 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7fad-account-create-update-bgwst" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.857879 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.870486 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpfsj\" (UniqueName: \"kubernetes.io/projected/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-kube-api-access-cpfsj\") pod \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.870665 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-operator-scripts\") pod \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\" (UID: \"3483c50d-cf68-45ab-b01b-7fe2e6f1c057\") " Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.873536 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "3483c50d-cf68-45ab-b01b-7fe2e6f1c057" (UID: "3483c50d-cf68-45ab-b01b-7fe2e6f1c057"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.889989 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-kube-api-access-cpfsj" (OuterVolumeSpecName: "kube-api-access-cpfsj") pod "3483c50d-cf68-45ab-b01b-7fe2e6f1c057" (UID: "3483c50d-cf68-45ab-b01b-7fe2e6f1c057"). InnerVolumeSpecName "kube-api-access-cpfsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.890670 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.916395 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.968313 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.972499 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p94rb\" (UniqueName: \"kubernetes.io/projected/920936eb-f659-4feb-b571-e90906e8bee2-kube-api-access-p94rb\") pod \"920936eb-f659-4feb-b571-e90906e8bee2\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.972702 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920936eb-f659-4feb-b571-e90906e8bee2-operator-scripts\") pod \"920936eb-f659-4feb-b571-e90906e8bee2\" (UID: \"920936eb-f659-4feb-b571-e90906e8bee2\") " Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.973367 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920936eb-f659-4feb-b571-e90906e8bee2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "920936eb-f659-4feb-b571-e90906e8bee2" (UID: "920936eb-f659-4feb-b571-e90906e8bee2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.973902 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.973924 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpfsj\" (UniqueName: \"kubernetes.io/projected/3483c50d-cf68-45ab-b01b-7fe2e6f1c057-kube-api-access-cpfsj\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.973938 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920936eb-f659-4feb-b571-e90906e8bee2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:23 crc kubenswrapper[4606]: I1212 00:47:23.981221 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920936eb-f659-4feb-b571-e90906e8bee2-kube-api-access-p94rb" (OuterVolumeSpecName: "kube-api-access-p94rb") pod "920936eb-f659-4feb-b571-e90906e8bee2" (UID: "920936eb-f659-4feb-b571-e90906e8bee2"). InnerVolumeSpecName "kube-api-access-p94rb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.076596 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjpv8\" (UniqueName: \"kubernetes.io/projected/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-kube-api-access-fjpv8\") pod \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\" (UID: \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.076646 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x7v6\" (UniqueName: \"kubernetes.io/projected/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-kube-api-access-7x7v6\") pod \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.076668 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-operator-scripts\") pod \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\" (UID: \"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.076698 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-operator-scripts\") pod \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\" (UID: \"2bbd1699-c391-4c27-9a8b-9dadfc9d5530\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.076984 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p94rb\" (UniqueName: \"kubernetes.io/projected/920936eb-f659-4feb-b571-e90906e8bee2-kube-api-access-p94rb\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.077386 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bbd1699-c391-4c27-9a8b-9dadfc9d5530" (UID: "2bbd1699-c391-4c27-9a8b-9dadfc9d5530"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.077774 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda" (UID: "f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.082389 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-kube-api-access-fjpv8" (OuterVolumeSpecName: "kube-api-access-fjpv8") pod "2bbd1699-c391-4c27-9a8b-9dadfc9d5530" (UID: "2bbd1699-c391-4c27-9a8b-9dadfc9d5530"). InnerVolumeSpecName "kube-api-access-fjpv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.090390 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-kube-api-access-7x7v6" (OuterVolumeSpecName: "kube-api-access-7x7v6") pod "f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda" (UID: "f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda"). InnerVolumeSpecName "kube-api-access-7x7v6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.178472 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjpv8\" (UniqueName: \"kubernetes.io/projected/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-kube-api-access-fjpv8\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.178501 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x7v6\" (UniqueName: \"kubernetes.io/projected/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-kube-api-access-7x7v6\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.178511 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.178521 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bbd1699-c391-4c27-9a8b-9dadfc9d5530-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.513678 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.694765 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.704218 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.789637 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdg66\" (UniqueName: \"kubernetes.io/projected/509a7acf-27c5-45b9-8018-2b21b84b9b0a-kube-api-access-fdg66\") pod \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.789704 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3208805-45c9-44bd-b7b3-622cdbc2dae9-operator-scripts\") pod \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.789735 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/509a7acf-27c5-45b9-8018-2b21b84b9b0a-operator-scripts\") pod \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\" (UID: \"509a7acf-27c5-45b9-8018-2b21b84b9b0a\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.789756 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rb77\" (UniqueName: \"kubernetes.io/projected/c3208805-45c9-44bd-b7b3-622cdbc2dae9-kube-api-access-6rb77\") pod \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\" (UID: \"c3208805-45c9-44bd-b7b3-622cdbc2dae9\") " Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.790840 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3208805-45c9-44bd-b7b3-622cdbc2dae9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3208805-45c9-44bd-b7b3-622cdbc2dae9" (UID: "c3208805-45c9-44bd-b7b3-622cdbc2dae9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.791457 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/509a7acf-27c5-45b9-8018-2b21b84b9b0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "509a7acf-27c5-45b9-8018-2b21b84b9b0a" (UID: "509a7acf-27c5-45b9-8018-2b21b84b9b0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.801992 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3208805-45c9-44bd-b7b3-622cdbc2dae9-kube-api-access-6rb77" (OuterVolumeSpecName: "kube-api-access-6rb77") pod "c3208805-45c9-44bd-b7b3-622cdbc2dae9" (UID: "c3208805-45c9-44bd-b7b3-622cdbc2dae9"). InnerVolumeSpecName "kube-api-access-6rb77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.802446 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509a7acf-27c5-45b9-8018-2b21b84b9b0a-kube-api-access-fdg66" (OuterVolumeSpecName: "kube-api-access-fdg66") pod "509a7acf-27c5-45b9-8018-2b21b84b9b0a" (UID: "509a7acf-27c5-45b9-8018-2b21b84b9b0a"). InnerVolumeSpecName "kube-api-access-fdg66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.879971 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5k8jp" event={"ID":"920936eb-f659-4feb-b571-e90906e8bee2","Type":"ContainerDied","Data":"7b1f810ebad14e2477ba861506f00a5acaf2bd68c8015dadc1b96dcda94da917"} Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.880006 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b1f810ebad14e2477ba861506f00a5acaf2bd68c8015dadc1b96dcda94da917" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.880086 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5k8jp" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.891826 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdg66\" (UniqueName: \"kubernetes.io/projected/509a7acf-27c5-45b9-8018-2b21b84b9b0a-kube-api-access-fdg66\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.891861 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3208805-45c9-44bd-b7b3-622cdbc2dae9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.891873 4606 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/509a7acf-27c5-45b9-8018-2b21b84b9b0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.891885 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rb77\" (UniqueName: \"kubernetes.io/projected/c3208805-45c9-44bd-b7b3-622cdbc2dae9-kube-api-access-6rb77\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.895305 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-13d9-account-create-update-glqls" event={"ID":"c3208805-45c9-44bd-b7b3-622cdbc2dae9","Type":"ContainerDied","Data":"9a8018148efc3a1351027878edc96c727797f229798e65e1c3a76ae6f2eeb991"} Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.895340 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a8018148efc3a1351027878edc96c727797f229798e65e1c3a76ae6f2eeb991" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.895417 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-13d9-account-create-update-glqls" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.912924 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lh8vs" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.912954 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lh8vs" event={"ID":"f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda","Type":"ContainerDied","Data":"5d8c3857532e877392c9d46a3d22441d63c9ff008846a4c7a21098792c898aa4"} Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.912995 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d8c3857532e877392c9d46a3d22441d63c9ff008846a4c7a21098792c898aa4" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.918141 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa0e877c-3d78-482d-8bb0-003663d82e4a","Type":"ContainerStarted","Data":"59fabc69fd242090f9bcc6cac9182ee95f5eca033be3101e6d5c5888b8cfc5a0"} Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.921711 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zhvg8" event={"ID":"2bbd1699-c391-4c27-9a8b-9dadfc9d5530","Type":"ContainerDied","Data":"d0f0505c21f4eb9eeba32f5364c3426638241d3d8b8ab81ce587882f768208c8"} Dec 12 00:47:24 crc 
kubenswrapper[4606]: I1212 00:47:24.921749 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0f0505c21f4eb9eeba32f5364c3426638241d3d8b8ab81ce587882f768208c8" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.921814 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zhvg8" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.929793 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.930018 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-log" containerID="cri-o://54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b" gracePeriod=30 Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.930473 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-httpd" containerID="cri-o://12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb" gracePeriod=30 Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.934697 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerStarted","Data":"4d08906e5794ce9ba0fc81b6982aa61cedbc3bee3d449247e9efe2050bb5acd4"} Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.934730 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerStarted","Data":"19e16a7d61dd0c924474397f204012d9cffd29d3e6c96532d414aab9412c405e"} Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.937779 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" event={"ID":"509a7acf-27c5-45b9-8018-2b21b84b9b0a","Type":"ContainerDied","Data":"1f7547771307df898eff7eba37c34c9cc7196d6f110c52c9f7e779bc31abff65"} Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.937808 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f7547771307df898eff7eba37c34c9cc7196d6f110c52c9f7e779bc31abff65" Dec 12 00:47:24 crc kubenswrapper[4606]: I1212 00:47:24.937861 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6bb0-account-create-update-p7kwg" Dec 12 00:47:25 crc kubenswrapper[4606]: I1212 00:47:25.953692 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa0e877c-3d78-482d-8bb0-003663d82e4a","Type":"ContainerStarted","Data":"e2fca6014329db322949b7bf25061d69ee3b0f5d32a87b0a8c5f3f6dcad74949"} Dec 12 00:47:25 crc kubenswrapper[4606]: I1212 00:47:25.969776 4606 generic.go:334] "Generic (PLEG): container finished" podID="c4accb34-f155-4198-b222-0800ff8755c3" containerID="54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b" exitCode=143 Dec 12 00:47:25 crc kubenswrapper[4606]: I1212 00:47:25.969816 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c4accb34-f155-4198-b222-0800ff8755c3","Type":"ContainerDied","Data":"54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b"} Dec 12 00:47:26 crc kubenswrapper[4606]: I1212 00:47:26.979514 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa0e877c-3d78-482d-8bb0-003663d82e4a","Type":"ContainerStarted","Data":"f3d91cc9c27247ed117c293a9121118b05135cbd1ceb2f907ed3fd85a85df7c4"} Dec 12 00:47:26 crc kubenswrapper[4606]: I1212 00:47:26.982537 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerStarted","Data":"fef1a907e5308de5f4e3bd067eae513f37d73ca5580b04fd8cf294f068e72cc6"} Dec 12 00:47:27 crc kubenswrapper[4606]: I1212 00:47:27.004814 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.004603065 podStartE2EDuration="5.004603065s" podCreationTimestamp="2025-12-12 00:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:47:27.003024883 +0000 UTC m=+1437.548377749" watchObservedRunningTime="2025-12-12 00:47:27.004603065 +0000 UTC m=+1437.549955931" Dec 12 00:47:27 crc kubenswrapper[4606]: I1212 00:47:27.994820 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerStarted","Data":"8bb682f7dd660207006f92d30ed05a706a2f8b83c1f54ce22e8e4b86555d52bf"} Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.751729 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.883564 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-combined-ca-bundle\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.883855 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-logs\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.884014 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-config-data\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.884598 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-logs" (OuterVolumeSpecName: "logs") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.884694 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.884770 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsqzl\" (UniqueName: \"kubernetes.io/projected/c4accb34-f155-4198-b222-0800ff8755c3-kube-api-access-bsqzl\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.884889 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-httpd-run\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.885052 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-scripts\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.885133 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-public-tls-certs\") pod \"c4accb34-f155-4198-b222-0800ff8755c3\" (UID: \"c4accb34-f155-4198-b222-0800ff8755c3\") " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.885676 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-logs\") on node 
\"crc\" DevicePath \"\"" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.886437 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.890082 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4accb34-f155-4198-b222-0800ff8755c3-kube-api-access-bsqzl" (OuterVolumeSpecName: "kube-api-access-bsqzl") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "kube-api-access-bsqzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.890102 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.899097 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-scripts" (OuterVolumeSpecName: "scripts") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.916288 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.957300 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.968435 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-config-data" (OuterVolumeSpecName: "config-data") pod "c4accb34-f155-4198-b222-0800ff8755c3" (UID: "c4accb34-f155-4198-b222-0800ff8755c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.987921 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.988253 4606 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.988367 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.988462 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4accb34-f155-4198-b222-0800ff8755c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.988584 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.988699 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsqzl\" (UniqueName: \"kubernetes.io/projected/c4accb34-f155-4198-b222-0800ff8755c3-kube-api-access-bsqzl\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:28 crc kubenswrapper[4606]: I1212 00:47:28.988770 4606 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4accb34-f155-4198-b222-0800ff8755c3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.004962 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerStarted","Data":"22f615273af1cba8df38973d7592fbdfe3f400f290a85b5be4ed45569443c733"} Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.007332 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.009564 4606 generic.go:334] "Generic (PLEG): container finished" podID="c4accb34-f155-4198-b222-0800ff8755c3" containerID="12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb" exitCode=0 Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.009644 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.009664 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c4accb34-f155-4198-b222-0800ff8755c3","Type":"ContainerDied","Data":"12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb"} Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.010863 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c4accb34-f155-4198-b222-0800ff8755c3","Type":"ContainerDied","Data":"a2183fecdd829489fcbd25eb8052b9d29697ccc0f5f85b55a4f7794ca15bb399"} Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.010886 4606 scope.go:117] "RemoveContainer" containerID="12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.015943 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.035925 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=3.796952786 podStartE2EDuration="8.03590618s" podCreationTimestamp="2025-12-12 00:47:21 +0000 UTC" firstStartedPulling="2025-12-12 00:47:23.910284975 +0000 UTC m=+1434.455637841" lastFinishedPulling="2025-12-12 00:47:28.149238369 +0000 UTC m=+1438.694591235" observedRunningTime="2025-12-12 00:47:29.032312263 +0000 UTC m=+1439.577665129" watchObservedRunningTime="2025-12-12 00:47:29.03590618 +0000 UTC m=+1439.581259046" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.042297 4606 scope.go:117] "RemoveContainer" containerID="54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.059117 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.062918 4606 scope.go:117] "RemoveContainer" containerID="12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.063308 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb\": container with ID starting with 12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb not found: ID does not exist" containerID="12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.063337 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb"} err="failed to get container status \"12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb\": rpc error: code = NotFound desc = could not find container \"12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb\": container with ID starting with 
12f4e82554976085fa170a9fcbd902bb2540c05f81b4c6d1b9cbca2a1a5219fb not found: ID does not exist" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.063357 4606 scope.go:117] "RemoveContainer" containerID="54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.063654 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b\": container with ID starting with 54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b not found: ID does not exist" containerID="54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.063731 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b"} err="failed to get container status \"54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b\": rpc error: code = NotFound desc = could not find container \"54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b\": container with ID starting with 54585af99135294041a5d78a2a9e320dc5ae2f06f493abdfbd3547ad292de90b not found: ID does not exist" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.070048 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.090267 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.096760 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.097115 4606 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2bbd1699-c391-4c27-9a8b-9dadfc9d5530" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.097130 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbd1699-c391-4c27-9a8b-9dadfc9d5530" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.097141 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920936eb-f659-4feb-b571-e90906e8bee2" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.097147 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="920936eb-f659-4feb-b571-e90906e8bee2" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.097161 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-log" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.097166 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-log" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.100306 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3483c50d-cf68-45ab-b01b-7fe2e6f1c057" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100327 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3483c50d-cf68-45ab-b01b-7fe2e6f1c057" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.100336 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-httpd" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100341 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-httpd" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 
00:47:29.100357 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509a7acf-27c5-45b9-8018-2b21b84b9b0a" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100363 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="509a7acf-27c5-45b9-8018-2b21b84b9b0a" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.100384 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100389 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: E1212 00:47:29.100403 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3208805-45c9-44bd-b7b3-622cdbc2dae9" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100409 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3208805-45c9-44bd-b7b3-622cdbc2dae9" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100633 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3208805-45c9-44bd-b7b3-622cdbc2dae9" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100650 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3483c50d-cf68-45ab-b01b-7fe2e6f1c057" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100659 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-log" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100667 4606 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100675 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbd1699-c391-4c27-9a8b-9dadfc9d5530" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100685 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="920936eb-f659-4feb-b571-e90906e8bee2" containerName="mariadb-database-create" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100695 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4accb34-f155-4198-b222-0800ff8755c3" containerName="glance-httpd" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.100705 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="509a7acf-27c5-45b9-8018-2b21b84b9b0a" containerName="mariadb-account-create-update" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.101545 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.128116 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.138793 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.153029 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192243 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24e9dfd8-9299-4981-b95d-a4200749037c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192290 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-scripts\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192349 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24e9dfd8-9299-4981-b95d-a4200749037c-logs\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192379 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxkd9\" (UniqueName: 
\"kubernetes.io/projected/24e9dfd8-9299-4981-b95d-a4200749037c-kube-api-access-qxkd9\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192400 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192449 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-config-data\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192464 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.192494 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.293583 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/24e9dfd8-9299-4981-b95d-a4200749037c-logs\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.293883 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxkd9\" (UniqueName: \"kubernetes.io/projected/24e9dfd8-9299-4981-b95d-a4200749037c-kube-api-access-qxkd9\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.294230 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.294097 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24e9dfd8-9299-4981-b95d-a4200749037c-logs\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.294733 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-config-data\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.294800 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") 
pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.294981 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.297009 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.297095 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24e9dfd8-9299-4981-b95d-a4200749037c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.297126 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-scripts\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.298491 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24e9dfd8-9299-4981-b95d-a4200749037c-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.303729 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.304114 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.312398 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxkd9\" (UniqueName: \"kubernetes.io/projected/24e9dfd8-9299-4981-b95d-a4200749037c-kube-api-access-qxkd9\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.318782 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-scripts\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.325203 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e9dfd8-9299-4981-b95d-a4200749037c-config-data\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" 
Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.327748 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"24e9dfd8-9299-4981-b95d-a4200749037c\") " pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.402786 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s2ncv"] Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.403974 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.411030 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.411285 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vcljj" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.411291 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.429087 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s2ncv"] Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.462262 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.501117 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-config-data\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.501475 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjx2d\" (UniqueName: \"kubernetes.io/projected/73f59adf-51e1-45e9-95d8-ac24a6310f1e-kube-api-access-zjx2d\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.501624 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-scripts\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.501712 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.603416 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-scripts\") pod 
\"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.603685 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.603817 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-config-data\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.603878 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjx2d\" (UniqueName: \"kubernetes.io/projected/73f59adf-51e1-45e9-95d8-ac24a6310f1e-kube-api-access-zjx2d\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.613953 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-config-data\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.615697 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.616091 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-scripts\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.630965 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjx2d\" (UniqueName: \"kubernetes.io/projected/73f59adf-51e1-45e9-95d8-ac24a6310f1e-kube-api-access-zjx2d\") pod \"nova-cell0-conductor-db-sync-s2ncv\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.726537 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vcljj" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.731103 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.731223 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:29 crc kubenswrapper[4606]: I1212 00:47:29.736208 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4accb34-f155-4198-b222-0800ff8755c3" path="/var/lib/kubelet/pods/c4accb34-f155-4198-b222-0800ff8755c3/volumes" Dec 12 00:47:30 crc kubenswrapper[4606]: I1212 00:47:30.182065 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 12 00:47:30 crc kubenswrapper[4606]: I1212 00:47:30.404834 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s2ncv"] Dec 12 00:47:30 crc kubenswrapper[4606]: W1212 00:47:30.446722 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73f59adf_51e1_45e9_95d8_ac24a6310f1e.slice/crio-89bec40075158683afd89f06646309bbd311dc563dddf3e31f8ce517c25a8524 WatchSource:0}: Error finding container 89bec40075158683afd89f06646309bbd311dc563dddf3e31f8ce517c25a8524: Status 404 returned error can't find the container with id 89bec40075158683afd89f06646309bbd311dc563dddf3e31f8ce517c25a8524 Dec 12 00:47:31 crc kubenswrapper[4606]: I1212 00:47:31.093186 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" event={"ID":"73f59adf-51e1-45e9-95d8-ac24a6310f1e","Type":"ContainerStarted","Data":"89bec40075158683afd89f06646309bbd311dc563dddf3e31f8ce517c25a8524"} Dec 12 00:47:31 crc kubenswrapper[4606]: I1212 00:47:31.094538 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0d0864c8-b45f-4324-a56f-ff583d488da0","Type":"ContainerStarted","Data":"12f51c6711abdb42d80e24b4f68aa7bc1510eddc2ef86fc22ab966f15c67eb36"} Dec 12 00:47:31 crc kubenswrapper[4606]: I1212 00:47:31.097305 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"24e9dfd8-9299-4981-b95d-a4200749037c","Type":"ContainerStarted","Data":"759623b8be891c6a6df75f3df2371a5bb21b7779704528e8f25556dd2bb079c9"} Dec 12 00:47:31 crc kubenswrapper[4606]: I1212 00:47:31.097336 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24e9dfd8-9299-4981-b95d-a4200749037c","Type":"ContainerStarted","Data":"176ab4ffacdb686ac8a0e04601812acaaf33737475e1f7374595775da950ffb0"} Dec 12 00:47:31 crc kubenswrapper[4606]: I1212 00:47:31.118851 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.487322572 podStartE2EDuration="34.118817756s" podCreationTimestamp="2025-12-12 00:46:57 +0000 UTC" firstStartedPulling="2025-12-12 00:46:58.605714818 +0000 UTC m=+1409.151067684" lastFinishedPulling="2025-12-12 00:47:30.237210002 +0000 UTC m=+1440.782562868" observedRunningTime="2025-12-12 00:47:31.114419988 +0000 UTC m=+1441.659772854" watchObservedRunningTime="2025-12-12 00:47:31.118817756 +0000 UTC m=+1441.664170622" Dec 12 00:47:31 crc kubenswrapper[4606]: I1212 00:47:31.906252 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.010478 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.010536 4606 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.112327 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24e9dfd8-9299-4981-b95d-a4200749037c","Type":"ContainerStarted","Data":"6cdf9b65c0efd6cd8b858475c956cee97108174683feb08c6cfb18adda53826a"} Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.143269 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.14325313 podStartE2EDuration="3.14325313s" podCreationTimestamp="2025-12-12 00:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:47:32.14028694 +0000 UTC m=+1442.685639806" watchObservedRunningTime="2025-12-12 00:47:32.14325313 +0000 UTC m=+1442.688605996" Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.178597 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.178671 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.179449 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"45c639ce0fe5cb7959a1e8ff1646d4eb9c473c0bc5c88c824b78ae25e0ea3b06"} 
pod="openstack/horizon-b9fb498f6-62fcc" containerMessage="Container horizon failed startup probe, will be restarted" Dec 12 00:47:32 crc kubenswrapper[4606]: I1212 00:47:32.179485 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" containerID="cri-o://45c639ce0fe5cb7959a1e8ff1646d4eb9c473c0bc5c88c824b78ae25e0ea3b06" gracePeriod=30 Dec 12 00:47:33 crc kubenswrapper[4606]: I1212 00:47:33.556813 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:33 crc kubenswrapper[4606]: I1212 00:47:33.557132 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:33 crc kubenswrapper[4606]: I1212 00:47:33.612877 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:33 crc kubenswrapper[4606]: I1212 00:47:33.639743 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:34 crc kubenswrapper[4606]: I1212 00:47:34.132646 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:34 crc kubenswrapper[4606]: I1212 00:47:34.132689 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:36 crc kubenswrapper[4606]: I1212 00:47:36.739000 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:36 crc kubenswrapper[4606]: I1212 00:47:36.739652 4606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 00:47:36 crc kubenswrapper[4606]: I1212 00:47:36.815670 4606 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 12 00:47:39 crc kubenswrapper[4606]: I1212 00:47:39.463383 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 12 00:47:39 crc kubenswrapper[4606]: I1212 00:47:39.463714 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 12 00:47:39 crc kubenswrapper[4606]: I1212 00:47:39.511814 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 12 00:47:39 crc kubenswrapper[4606]: I1212 00:47:39.516593 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 12 00:47:40 crc kubenswrapper[4606]: I1212 00:47:40.180491 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 12 00:47:40 crc kubenswrapper[4606]: I1212 00:47:40.180535 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 12 00:47:41 crc kubenswrapper[4606]: I1212 00:47:41.568485 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:41 crc kubenswrapper[4606]: I1212 00:47:41.569397 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-central-agent" containerID="cri-o://4d08906e5794ce9ba0fc81b6982aa61cedbc3bee3d449247e9efe2050bb5acd4" gracePeriod=30 Dec 12 00:47:41 crc kubenswrapper[4606]: I1212 00:47:41.569438 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-notification-agent" 
containerID="cri-o://fef1a907e5308de5f4e3bd067eae513f37d73ca5580b04fd8cf294f068e72cc6" gracePeriod=30 Dec 12 00:47:41 crc kubenswrapper[4606]: I1212 00:47:41.569427 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="sg-core" containerID="cri-o://8bb682f7dd660207006f92d30ed05a706a2f8b83c1f54ce22e8e4b86555d52bf" gracePeriod=30 Dec 12 00:47:41 crc kubenswrapper[4606]: I1212 00:47:41.569515 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="proxy-httpd" containerID="cri-o://22f615273af1cba8df38973d7592fbdfe3f400f290a85b5be4ed45569443c733" gracePeriod=30 Dec 12 00:47:41 crc kubenswrapper[4606]: I1212 00:47:41.581393 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.179:3000/\": EOF" Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.219472 4606 generic.go:334] "Generic (PLEG): container finished" podID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerID="22f615273af1cba8df38973d7592fbdfe3f400f290a85b5be4ed45569443c733" exitCode=0 Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.219738 4606 generic.go:334] "Generic (PLEG): container finished" podID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerID="8bb682f7dd660207006f92d30ed05a706a2f8b83c1f54ce22e8e4b86555d52bf" exitCode=2 Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.219749 4606 generic.go:334] "Generic (PLEG): container finished" podID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerID="4d08906e5794ce9ba0fc81b6982aa61cedbc3bee3d449247e9efe2050bb5acd4" exitCode=0 Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.219581 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerDied","Data":"22f615273af1cba8df38973d7592fbdfe3f400f290a85b5be4ed45569443c733"} Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.219793 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerDied","Data":"8bb682f7dd660207006f92d30ed05a706a2f8b83c1f54ce22e8e4b86555d52bf"} Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.219809 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerDied","Data":"4d08906e5794ce9ba0fc81b6982aa61cedbc3bee3d449247e9efe2050bb5acd4"} Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.881992 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.882130 4606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 00:47:42 crc kubenswrapper[4606]: I1212 00:47:42.896818 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.277995 4606 generic.go:334] "Generic (PLEG): container finished" podID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerID="fef1a907e5308de5f4e3bd067eae513f37d73ca5580b04fd8cf294f068e72cc6" exitCode=0 Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.289047 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerDied","Data":"fef1a907e5308de5f4e3bd067eae513f37d73ca5580b04fd8cf294f068e72cc6"} Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.418765 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.513448 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-run-httpd\") pod \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.513516 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-combined-ca-bundle\") pod \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.513557 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-config-data\") pod \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.513574 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-scripts\") pod \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.513594 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9bb7\" (UniqueName: \"kubernetes.io/projected/e8225349-5e4b-4de7-8a94-abb2feb23dc5-kube-api-access-h9bb7\") pod \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.513620 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-sg-core-conf-yaml\") pod \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.513700 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-log-httpd\") pod \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\" (UID: \"e8225349-5e4b-4de7-8a94-abb2feb23dc5\") " Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.514493 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8225349-5e4b-4de7-8a94-abb2feb23dc5" (UID: "e8225349-5e4b-4de7-8a94-abb2feb23dc5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.515011 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.515824 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8225349-5e4b-4de7-8a94-abb2feb23dc5" (UID: "e8225349-5e4b-4de7-8a94-abb2feb23dc5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.522223 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8225349-5e4b-4de7-8a94-abb2feb23dc5-kube-api-access-h9bb7" (OuterVolumeSpecName: "kube-api-access-h9bb7") pod "e8225349-5e4b-4de7-8a94-abb2feb23dc5" (UID: "e8225349-5e4b-4de7-8a94-abb2feb23dc5"). 
InnerVolumeSpecName "kube-api-access-h9bb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.522417 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-scripts" (OuterVolumeSpecName: "scripts") pod "e8225349-5e4b-4de7-8a94-abb2feb23dc5" (UID: "e8225349-5e4b-4de7-8a94-abb2feb23dc5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.549027 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8225349-5e4b-4de7-8a94-abb2feb23dc5" (UID: "e8225349-5e4b-4de7-8a94-abb2feb23dc5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.595216 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8225349-5e4b-4de7-8a94-abb2feb23dc5" (UID: "e8225349-5e4b-4de7-8a94-abb2feb23dc5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.621079 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.621111 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.621120 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9bb7\" (UniqueName: \"kubernetes.io/projected/e8225349-5e4b-4de7-8a94-abb2feb23dc5-kube-api-access-h9bb7\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.621131 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.621139 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8225349-5e4b-4de7-8a94-abb2feb23dc5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.635093 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-config-data" (OuterVolumeSpecName: "config-data") pod "e8225349-5e4b-4de7-8a94-abb2feb23dc5" (UID: "e8225349-5e4b-4de7-8a94-abb2feb23dc5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:43 crc kubenswrapper[4606]: I1212 00:47:43.723122 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8225349-5e4b-4de7-8a94-abb2feb23dc5-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.288215 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8225349-5e4b-4de7-8a94-abb2feb23dc5","Type":"ContainerDied","Data":"19e16a7d61dd0c924474397f204012d9cffd29d3e6c96532d414aab9412c405e"} Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.289365 4606 scope.go:117] "RemoveContainer" containerID="22f615273af1cba8df38973d7592fbdfe3f400f290a85b5be4ed45569443c733" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.289328 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.292893 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" event={"ID":"73f59adf-51e1-45e9-95d8-ac24a6310f1e","Type":"ContainerStarted","Data":"fdd80f0fe03a58d54350f78d7440602e42cfd91a7035fe119613a3b5bac54740"} Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.310952 4606 scope.go:117] "RemoveContainer" containerID="8bb682f7dd660207006f92d30ed05a706a2f8b83c1f54ce22e8e4b86555d52bf" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.313483 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.325578 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.340097 4606 scope.go:117] "RemoveContainer" containerID="fef1a907e5308de5f4e3bd067eae513f37d73ca5580b04fd8cf294f068e72cc6" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.365727 
4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" podStartSLOduration=2.645630775 podStartE2EDuration="15.365707455s" podCreationTimestamp="2025-12-12 00:47:29 +0000 UTC" firstStartedPulling="2025-12-12 00:47:30.468313075 +0000 UTC m=+1441.013665941" lastFinishedPulling="2025-12-12 00:47:43.188389755 +0000 UTC m=+1453.733742621" observedRunningTime="2025-12-12 00:47:44.337338215 +0000 UTC m=+1454.882691081" watchObservedRunningTime="2025-12-12 00:47:44.365707455 +0000 UTC m=+1454.911060321" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.370633 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:44 crc kubenswrapper[4606]: E1212 00:47:44.372460 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-central-agent" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.372914 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-central-agent" Dec 12 00:47:44 crc kubenswrapper[4606]: E1212 00:47:44.372989 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="proxy-httpd" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.373041 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="proxy-httpd" Dec 12 00:47:44 crc kubenswrapper[4606]: E1212 00:47:44.373106 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-notification-agent" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.373155 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-notification-agent" Dec 12 00:47:44 crc kubenswrapper[4606]: E1212 
00:47:44.373239 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="sg-core" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.373301 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="sg-core" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.373525 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-central-agent" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.373598 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="sg-core" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.373660 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="ceilometer-notification-agent" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.373720 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" containerName="proxy-httpd" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.370919 4606 scope.go:117] "RemoveContainer" containerID="4d08906e5794ce9ba0fc81b6982aa61cedbc3bee3d449247e9efe2050bb5acd4" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.375395 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.410343 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.410345 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.433249 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.434401 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-scripts\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.434506 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-run-httpd\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.434622 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-config-data\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.434728 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " 
pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.434830 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.434949 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbf89\" (UniqueName: \"kubernetes.io/projected/5a3279fa-90bc-4901-b487-999ec5662aca-kube-api-access-wbf89\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.435057 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-log-httpd\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.537046 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.537119 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.537191 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wbf89\" (UniqueName: \"kubernetes.io/projected/5a3279fa-90bc-4901-b487-999ec5662aca-kube-api-access-wbf89\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.537222 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-log-httpd\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.537286 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-scripts\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.537308 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-run-httpd\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.537396 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-config-data\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.538654 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-log-httpd\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" 
Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.539239 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-run-httpd\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.559031 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.574054 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-config-data\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.578742 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.587709 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-scripts\") pod \"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.591450 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbf89\" (UniqueName: \"kubernetes.io/projected/5a3279fa-90bc-4901-b487-999ec5662aca-kube-api-access-wbf89\") pod 
\"ceilometer-0\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.731737 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:47:44 crc kubenswrapper[4606]: I1212 00:47:44.807944 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:45 crc kubenswrapper[4606]: I1212 00:47:45.292139 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:47:45 crc kubenswrapper[4606]: W1212 00:47:45.327333 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a3279fa_90bc_4901_b487_999ec5662aca.slice/crio-8d2bc4f98c9feaae5a430000a5bbfa606fc44e8e4a8ead7903cfde6001b9f159 WatchSource:0}: Error finding container 8d2bc4f98c9feaae5a430000a5bbfa606fc44e8e4a8ead7903cfde6001b9f159: Status 404 returned error can't find the container with id 8d2bc4f98c9feaae5a430000a5bbfa606fc44e8e4a8ead7903cfde6001b9f159 Dec 12 00:47:45 crc kubenswrapper[4606]: I1212 00:47:45.330281 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:47:45 crc kubenswrapper[4606]: I1212 00:47:45.709899 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8225349-5e4b-4de7-8a94-abb2feb23dc5" path="/var/lib/kubelet/pods/e8225349-5e4b-4de7-8a94-abb2feb23dc5/volumes" Dec 12 00:47:46 crc kubenswrapper[4606]: I1212 00:47:46.317975 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerStarted","Data":"d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447"} Dec 12 00:47:46 crc kubenswrapper[4606]: I1212 00:47:46.318281 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerStarted","Data":"8d2bc4f98c9feaae5a430000a5bbfa606fc44e8e4a8ead7903cfde6001b9f159"} Dec 12 00:47:47 crc kubenswrapper[4606]: I1212 00:47:47.245058 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:47:47 crc kubenswrapper[4606]: I1212 00:47:47.343508 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerStarted","Data":"43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570"} Dec 12 00:47:49 crc kubenswrapper[4606]: I1212 00:47:49.363898 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerStarted","Data":"f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb"} Dec 12 00:47:51 crc kubenswrapper[4606]: I1212 00:47:51.384918 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerStarted","Data":"d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f"} Dec 12 00:47:51 crc kubenswrapper[4606]: I1212 00:47:51.385353 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-central-agent" containerID="cri-o://d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447" gracePeriod=30 Dec 12 00:47:51 crc kubenswrapper[4606]: I1212 00:47:51.385388 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="sg-core" containerID="cri-o://f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb" gracePeriod=30 Dec 12 00:47:51 crc kubenswrapper[4606]: I1212 00:47:51.385384 4606 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="proxy-httpd" containerID="cri-o://d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f" gracePeriod=30 Dec 12 00:47:51 crc kubenswrapper[4606]: I1212 00:47:51.385400 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-notification-agent" containerID="cri-o://43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570" gracePeriod=30 Dec 12 00:47:51 crc kubenswrapper[4606]: I1212 00:47:51.385754 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:47:51 crc kubenswrapper[4606]: I1212 00:47:51.410112 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.476575413 podStartE2EDuration="7.410092639s" podCreationTimestamp="2025-12-12 00:47:44 +0000 UTC" firstStartedPulling="2025-12-12 00:47:45.329422801 +0000 UTC m=+1455.874775667" lastFinishedPulling="2025-12-12 00:47:50.262940007 +0000 UTC m=+1460.808292893" observedRunningTime="2025-12-12 00:47:51.401894189 +0000 UTC m=+1461.947247055" watchObservedRunningTime="2025-12-12 00:47:51.410092639 +0000 UTC m=+1461.955445505" Dec 12 00:47:52 crc kubenswrapper[4606]: I1212 00:47:52.395077 4606 generic.go:334] "Generic (PLEG): container finished" podID="5a3279fa-90bc-4901-b487-999ec5662aca" containerID="d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f" exitCode=0 Dec 12 00:47:52 crc kubenswrapper[4606]: I1212 00:47:52.395432 4606 generic.go:334] "Generic (PLEG): container finished" podID="5a3279fa-90bc-4901-b487-999ec5662aca" containerID="f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb" exitCode=2 Dec 12 00:47:52 crc kubenswrapper[4606]: I1212 00:47:52.395441 4606 
generic.go:334] "Generic (PLEG): container finished" podID="5a3279fa-90bc-4901-b487-999ec5662aca" containerID="43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570" exitCode=0 Dec 12 00:47:52 crc kubenswrapper[4606]: I1212 00:47:52.395243 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerDied","Data":"d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f"} Dec 12 00:47:52 crc kubenswrapper[4606]: I1212 00:47:52.395469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerDied","Data":"f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb"} Dec 12 00:47:52 crc kubenswrapper[4606]: I1212 00:47:52.395481 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerDied","Data":"43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570"} Dec 12 00:47:58 crc kubenswrapper[4606]: I1212 00:47:58.440658 4606 generic.go:334] "Generic (PLEG): container finished" podID="73f59adf-51e1-45e9-95d8-ac24a6310f1e" containerID="fdd80f0fe03a58d54350f78d7440602e42cfd91a7035fe119613a3b5bac54740" exitCode=0 Dec 12 00:47:58 crc kubenswrapper[4606]: I1212 00:47:58.440830 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" event={"ID":"73f59adf-51e1-45e9-95d8-ac24a6310f1e","Type":"ContainerDied","Data":"fdd80f0fe03a58d54350f78d7440602e42cfd91a7035fe119613a3b5bac54740"} Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.802578 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.925207 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-scripts\") pod \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.925331 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjx2d\" (UniqueName: \"kubernetes.io/projected/73f59adf-51e1-45e9-95d8-ac24a6310f1e-kube-api-access-zjx2d\") pod \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.925376 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-config-data\") pod \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.925529 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-combined-ca-bundle\") pod \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\" (UID: \"73f59adf-51e1-45e9-95d8-ac24a6310f1e\") " Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.932509 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-scripts" (OuterVolumeSpecName: "scripts") pod "73f59adf-51e1-45e9-95d8-ac24a6310f1e" (UID: "73f59adf-51e1-45e9-95d8-ac24a6310f1e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.933357 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f59adf-51e1-45e9-95d8-ac24a6310f1e-kube-api-access-zjx2d" (OuterVolumeSpecName: "kube-api-access-zjx2d") pod "73f59adf-51e1-45e9-95d8-ac24a6310f1e" (UID: "73f59adf-51e1-45e9-95d8-ac24a6310f1e"). InnerVolumeSpecName "kube-api-access-zjx2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.957132 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73f59adf-51e1-45e9-95d8-ac24a6310f1e" (UID: "73f59adf-51e1-45e9-95d8-ac24a6310f1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:47:59 crc kubenswrapper[4606]: I1212 00:47:59.971008 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-config-data" (OuterVolumeSpecName: "config-data") pod "73f59adf-51e1-45e9-95d8-ac24a6310f1e" (UID: "73f59adf-51e1-45e9-95d8-ac24a6310f1e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.027404 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.027437 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjx2d\" (UniqueName: \"kubernetes.io/projected/73f59adf-51e1-45e9-95d8-ac24a6310f1e-kube-api-access-zjx2d\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.027449 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.027458 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f59adf-51e1-45e9-95d8-ac24a6310f1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.462496 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" event={"ID":"73f59adf-51e1-45e9-95d8-ac24a6310f1e","Type":"ContainerDied","Data":"89bec40075158683afd89f06646309bbd311dc563dddf3e31f8ce517c25a8524"} Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.462562 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89bec40075158683afd89f06646309bbd311dc563dddf3e31f8ce517c25a8524" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.462519 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s2ncv" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.567025 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 12 00:48:00 crc kubenswrapper[4606]: E1212 00:48:00.567442 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f59adf-51e1-45e9-95d8-ac24a6310f1e" containerName="nova-cell0-conductor-db-sync" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.567457 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f59adf-51e1-45e9-95d8-ac24a6310f1e" containerName="nova-cell0-conductor-db-sync" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.567636 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f59adf-51e1-45e9-95d8-ac24a6310f1e" containerName="nova-cell0-conductor-db-sync" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.571664 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.574441 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vcljj" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.583315 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.593380 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.641529 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57ec7a6-8024-4745-8f9a-3c85bb363d82-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: 
I1212 00:48:00.641582 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kjx6\" (UniqueName: \"kubernetes.io/projected/a57ec7a6-8024-4745-8f9a-3c85bb363d82-kube-api-access-2kjx6\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.641694 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57ec7a6-8024-4745-8f9a-3c85bb363d82-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.743353 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57ec7a6-8024-4745-8f9a-3c85bb363d82-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.743424 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kjx6\" (UniqueName: \"kubernetes.io/projected/a57ec7a6-8024-4745-8f9a-3c85bb363d82-kube-api-access-2kjx6\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.744636 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57ec7a6-8024-4745-8f9a-3c85bb363d82-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.750499 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57ec7a6-8024-4745-8f9a-3c85bb363d82-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.759846 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57ec7a6-8024-4745-8f9a-3c85bb363d82-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.776634 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kjx6\" (UniqueName: \"kubernetes.io/projected/a57ec7a6-8024-4745-8f9a-3c85bb363d82-kube-api-access-2kjx6\") pod \"nova-cell0-conductor-0\" (UID: \"a57ec7a6-8024-4745-8f9a-3c85bb363d82\") " pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:00 crc kubenswrapper[4606]: I1212 00:48:00.900038 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.328047 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366214 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-log-httpd\") pod \"5a3279fa-90bc-4901-b487-999ec5662aca\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366265 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-config-data\") pod \"5a3279fa-90bc-4901-b487-999ec5662aca\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366296 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-sg-core-conf-yaml\") pod \"5a3279fa-90bc-4901-b487-999ec5662aca\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366382 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbf89\" (UniqueName: \"kubernetes.io/projected/5a3279fa-90bc-4901-b487-999ec5662aca-kube-api-access-wbf89\") pod \"5a3279fa-90bc-4901-b487-999ec5662aca\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366411 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-run-httpd\") pod \"5a3279fa-90bc-4901-b487-999ec5662aca\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366451 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-combined-ca-bundle\") pod \"5a3279fa-90bc-4901-b487-999ec5662aca\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366514 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-scripts\") pod \"5a3279fa-90bc-4901-b487-999ec5662aca\" (UID: \"5a3279fa-90bc-4901-b487-999ec5662aca\") " Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.366894 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a3279fa-90bc-4901-b487-999ec5662aca" (UID: "5a3279fa-90bc-4901-b487-999ec5662aca"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.367252 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a3279fa-90bc-4901-b487-999ec5662aca" (UID: "5a3279fa-90bc-4901-b487-999ec5662aca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.374519 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-scripts" (OuterVolumeSpecName: "scripts") pod "5a3279fa-90bc-4901-b487-999ec5662aca" (UID: "5a3279fa-90bc-4901-b487-999ec5662aca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.379410 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3279fa-90bc-4901-b487-999ec5662aca-kube-api-access-wbf89" (OuterVolumeSpecName: "kube-api-access-wbf89") pod "5a3279fa-90bc-4901-b487-999ec5662aca" (UID: "5a3279fa-90bc-4901-b487-999ec5662aca"). InnerVolumeSpecName "kube-api-access-wbf89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.441780 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5a3279fa-90bc-4901-b487-999ec5662aca" (UID: "5a3279fa-90bc-4901-b487-999ec5662aca"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.482421 4606 generic.go:334] "Generic (PLEG): container finished" podID="5a3279fa-90bc-4901-b487-999ec5662aca" containerID="d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447" exitCode=0 Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.482472 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a3279fa-90bc-4901-b487-999ec5662aca" (UID: "5a3279fa-90bc-4901-b487-999ec5662aca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.482488 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.482510 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerDied","Data":"d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447"} Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483679 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a3279fa-90bc-4901-b487-999ec5662aca","Type":"ContainerDied","Data":"8d2bc4f98c9feaae5a430000a5bbfa606fc44e8e4a8ead7903cfde6001b9f159"} Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483748 4606 scope.go:117] "RemoveContainer" containerID="d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483839 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483867 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483898 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbf89\" (UniqueName: \"kubernetes.io/projected/5a3279fa-90bc-4901-b487-999ec5662aca-kube-api-access-wbf89\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483911 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a3279fa-90bc-4901-b487-999ec5662aca-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483923 4606 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.483936 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.518896 4606 scope.go:117] "RemoveContainer" containerID="f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.538114 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-config-data" (OuterVolumeSpecName: "config-data") pod "5a3279fa-90bc-4901-b487-999ec5662aca" (UID: "5a3279fa-90bc-4901-b487-999ec5662aca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.557218 4606 scope.go:117] "RemoveContainer" containerID="43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.563124 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.585501 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a3279fa-90bc-4901-b487-999ec5662aca-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.587660 4606 scope.go:117] "RemoveContainer" containerID="d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.618352 4606 scope.go:117] "RemoveContainer" containerID="d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f" Dec 12 00:48:01 
crc kubenswrapper[4606]: E1212 00:48:01.618965 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f\": container with ID starting with d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f not found: ID does not exist" containerID="d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.619056 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f"} err="failed to get container status \"d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f\": rpc error: code = NotFound desc = could not find container \"d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f\": container with ID starting with d5ca13b80eedeeb7bba7425d60ef4c6ebff323673dd57d2fa693733519214d6f not found: ID does not exist" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.619297 4606 scope.go:117] "RemoveContainer" containerID="f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb" Dec 12 00:48:01 crc kubenswrapper[4606]: E1212 00:48:01.619699 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb\": container with ID starting with f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb not found: ID does not exist" containerID="f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.619782 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb"} err="failed to get container status 
\"f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb\": rpc error: code = NotFound desc = could not find container \"f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb\": container with ID starting with f7386100b624464973e1ad1c1ed60575b11f7f32e9ed54665e31ffde18d4afeb not found: ID does not exist" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.619816 4606 scope.go:117] "RemoveContainer" containerID="43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570" Dec 12 00:48:01 crc kubenswrapper[4606]: E1212 00:48:01.620143 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570\": container with ID starting with 43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570 not found: ID does not exist" containerID="43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.620258 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570"} err="failed to get container status \"43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570\": rpc error: code = NotFound desc = could not find container \"43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570\": container with ID starting with 43f33118aea34745e33acdab17e4cd5c31a5e334030bc2ccf1387959c29d7570 not found: ID does not exist" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.620327 4606 scope.go:117] "RemoveContainer" containerID="d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447" Dec 12 00:48:01 crc kubenswrapper[4606]: E1212 00:48:01.622967 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447\": container with ID starting with d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447 not found: ID does not exist" containerID="d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.623008 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447"} err="failed to get container status \"d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447\": rpc error: code = NotFound desc = could not find container \"d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447\": container with ID starting with d86820d3a5b64d78f6d5135fd59c6c9abfc5de051933f9ccd0259808462da447 not found: ID does not exist" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.834288 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.844074 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.860422 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:01 crc kubenswrapper[4606]: E1212 00:48:01.860760 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-notification-agent" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.860777 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-notification-agent" Dec 12 00:48:01 crc kubenswrapper[4606]: E1212 00:48:01.860788 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="sg-core" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.860794 4606 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="sg-core" Dec 12 00:48:01 crc kubenswrapper[4606]: E1212 00:48:01.860813 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-central-agent" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.860820 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-central-agent" Dec 12 00:48:01 crc kubenswrapper[4606]: E1212 00:48:01.860828 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="proxy-httpd" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.860833 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="proxy-httpd" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.860984 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="proxy-httpd" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.860996 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-notification-agent" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.861007 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="ceilometer-central-agent" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.861023 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" containerName="sg-core" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.863724 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.865917 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.866408 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.884982 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.997302 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-run-httpd\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.997375 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-log-httpd\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.997440 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-config-data\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.997515 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " 
pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.997573 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-scripts\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.997605 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nchnh\" (UniqueName: \"kubernetes.io/projected/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-kube-api-access-nchnh\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:01 crc kubenswrapper[4606]: I1212 00:48:01.997703 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.009987 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.010037 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.010077 4606 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.010825 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.010874 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" gracePeriod=600 Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.099869 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-scripts\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.099938 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nchnh\" (UniqueName: \"kubernetes.io/projected/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-kube-api-access-nchnh\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.100040 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " 
pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.100133 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-run-httpd\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.100195 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-log-httpd\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.100231 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-config-data\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.100291 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.101125 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-run-httpd\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.101685 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-log-httpd\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.107987 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-scripts\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.108258 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-config-data\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.108585 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.124604 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.131128 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nchnh\" (UniqueName: \"kubernetes.io/projected/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-kube-api-access-nchnh\") pod \"ceilometer-0\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") " pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: E1212 00:48:02.150456 4606 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.181148 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.493196 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" exitCode=0 Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.493606 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37"} Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.493636 4606 scope.go:117] "RemoveContainer" containerID="e80327b00df207db6b4792ec6ccf6cd67956fd25c801ee29c3af3ba674cbe9cc" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.494243 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:48:02 crc kubenswrapper[4606]: E1212 00:48:02.494564 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" 
podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.496357 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a57ec7a6-8024-4745-8f9a-3c85bb363d82","Type":"ContainerStarted","Data":"f4cf7f3104a4c5b42f4a4228c787c8935b84852fc8881865fa0d3718937bd4ee"} Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.496385 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a57ec7a6-8024-4745-8f9a-3c85bb363d82","Type":"ContainerStarted","Data":"57c4ae42cfc3850fa9177a0dad39cafa954edf4383b83fad994378c0710bac17"} Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.496828 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.500084 4606 generic.go:334] "Generic (PLEG): container finished" podID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerID="45c639ce0fe5cb7959a1e8ff1646d4eb9c473c0bc5c88c824b78ae25e0ea3b06" exitCode=137 Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.500133 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9fb498f6-62fcc" event={"ID":"e38df57e-1a86-4c45-bf40-6282a6a049ed","Type":"ContainerDied","Data":"45c639ce0fe5cb7959a1e8ff1646d4eb9c473c0bc5c88c824b78ae25e0ea3b06"} Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.535577 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.535556386 podStartE2EDuration="2.535556386s" podCreationTimestamp="2025-12-12 00:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:02.531010584 +0000 UTC m=+1473.076363450" watchObservedRunningTime="2025-12-12 00:48:02.535556386 +0000 UTC m=+1473.080909252" Dec 12 00:48:02 crc 
kubenswrapper[4606]: I1212 00:48:02.669243 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:02 crc kubenswrapper[4606]: I1212 00:48:02.709027 4606 scope.go:117] "RemoveContainer" containerID="2dbf7369ad77ec21071d76169ebe84202398f88e4b0d626bed0634bc2c0923cd" Dec 12 00:48:03 crc kubenswrapper[4606]: I1212 00:48:03.536541 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerStarted","Data":"314bab1308a545aa3977fd76e5f21d2a28edcb4db3d952b9bba2f5c9acaa2236"} Dec 12 00:48:03 crc kubenswrapper[4606]: I1212 00:48:03.573088 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9fb498f6-62fcc" event={"ID":"e38df57e-1a86-4c45-bf40-6282a6a049ed","Type":"ContainerStarted","Data":"94c561297eb74c29227d303502172b4bf8795f519284f29f1abc9bfa0dcda988"} Dec 12 00:48:03 crc kubenswrapper[4606]: I1212 00:48:03.726663 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3279fa-90bc-4901-b487-999ec5662aca" path="/var/lib/kubelet/pods/5a3279fa-90bc-4901-b487-999ec5662aca/volumes" Dec 12 00:48:04 crc kubenswrapper[4606]: I1212 00:48:04.635491 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerStarted","Data":"e9385f2519eb011ca784d1ede40a391d82acd7e77913ce00e762d707764b1127"} Dec 12 00:48:05 crc kubenswrapper[4606]: I1212 00:48:05.647788 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerStarted","Data":"4f1e81bc1d9daeb41b40889d1e8d1cec27087f61b7b54f4e5d99c031acedc48f"} Dec 12 00:48:05 crc kubenswrapper[4606]: I1212 00:48:05.648134 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerStarted","Data":"d135a11dabe2b21a5d5f111d71771b37cd712b5a4675684608703b85d4b84a05"} Dec 12 00:48:07 crc kubenswrapper[4606]: I1212 00:48:07.673293 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerStarted","Data":"5457ea54ccc768adb098ee155755be6f0bc0286d124c50c835b3ce55b1720028"} Dec 12 00:48:07 crc kubenswrapper[4606]: I1212 00:48:07.674996 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:48:07 crc kubenswrapper[4606]: I1212 00:48:07.698926 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.292732296 podStartE2EDuration="6.698908512s" podCreationTimestamp="2025-12-12 00:48:01 +0000 UTC" firstStartedPulling="2025-12-12 00:48:02.68683673 +0000 UTC m=+1473.232189596" lastFinishedPulling="2025-12-12 00:48:07.093012936 +0000 UTC m=+1477.638365812" observedRunningTime="2025-12-12 00:48:07.694530085 +0000 UTC m=+1478.239882981" watchObservedRunningTime="2025-12-12 00:48:07.698908512 +0000 UTC m=+1478.244261388" Dec 12 00:48:10 crc kubenswrapper[4606]: I1212 00:48:10.946032 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.495115 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tfrdx"] Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.497058 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.502626 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.502703 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.525951 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tfrdx"] Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.597822 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-config-data\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.597905 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.597930 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-scripts\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.597966 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwblg\" (UniqueName: 
\"kubernetes.io/projected/8a61d852-d814-4230-9ab7-4d0b5742b00a-kube-api-access-rwblg\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.702737 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-config-data\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.702824 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.702855 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-scripts\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.702890 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwblg\" (UniqueName: \"kubernetes.io/projected/8a61d852-d814-4230-9ab7-4d0b5742b00a-kube-api-access-rwblg\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.717429 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-config-data\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.736804 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-scripts\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.737424 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.742926 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwblg\" (UniqueName: \"kubernetes.io/projected/8a61d852-d814-4230-9ab7-4d0b5742b00a-kube-api-access-rwblg\") pod \"nova-cell0-cell-mapping-tfrdx\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.751254 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.754640 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.757727 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.774021 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.808307 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfgkm\" (UniqueName: \"kubernetes.io/projected/4ce96712-78cb-43cb-8f08-563664ef0451-kube-api-access-sfgkm\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.808379 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce96712-78cb-43cb-8f08-563664ef0451-logs\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.808474 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.808499 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-config-data\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.818581 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.823318 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.824801 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.833382 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.911593 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.911647 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-config-data\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.911776 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfgkm\" (UniqueName: \"kubernetes.io/projected/4ce96712-78cb-43cb-8f08-563664ef0451-kube-api-access-sfgkm\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.911842 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce96712-78cb-43cb-8f08-563664ef0451-logs\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc 
kubenswrapper[4606]: I1212 00:48:11.912562 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce96712-78cb-43cb-8f08-563664ef0451-logs\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.956437 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-config-data\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.973639 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.978906 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfgkm\" (UniqueName: \"kubernetes.io/projected/4ce96712-78cb-43cb-8f08-563664ef0451-kube-api-access-sfgkm\") pod \"nova-api-0\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " pod="openstack/nova-api-0" Dec 12 00:48:11 crc kubenswrapper[4606]: I1212 00:48:11.993913 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.000759 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.017723 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-config-data\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.017814 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.018773 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6xcx\" (UniqueName: \"kubernetes.io/projected/477394c1-2eda-4a72-92af-ad59f431fe83-kube-api-access-x6xcx\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.098404 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.099720 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.103666 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.120101 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.120193 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6xcx\" (UniqueName: \"kubernetes.io/projected/477394c1-2eda-4a72-92af-ad59f431fe83-kube-api-access-x6xcx\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.120310 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-config-data\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.161514 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6xcx\" (UniqueName: \"kubernetes.io/projected/477394c1-2eda-4a72-92af-ad59f431fe83-kube-api-access-x6xcx\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.166588 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-combined-ca-bundle\") pod \"nova-scheduler-0\" 
(UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.172774 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-config-data\") pod \"nova-scheduler-0\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.187571 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.187601 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b9fb498f6-62fcc" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.193213 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.198849 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.214613 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.216610 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.222837 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.223420 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qp7\" (UniqueName: \"kubernetes.io/projected/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-kube-api-access-x2qp7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.223507 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.224629 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.263480 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.309467 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-4t5v7"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.311663 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.316631 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330322 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a79a593-ee61-48db-a4db-103ea5495c3a-logs\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330360 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330389 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-config\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330459 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjm2\" (UniqueName: \"kubernetes.io/projected/7a79a593-ee61-48db-a4db-103ea5495c3a-kube-api-access-6bjm2\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330482 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330499 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330533 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qp7\" (UniqueName: \"kubernetes.io/projected/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-kube-api-access-x2qp7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330581 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gs5x\" (UniqueName: \"kubernetes.io/projected/a930b062-89d9-4d8d-b649-b926aa8b2fe9-kube-api-access-7gs5x\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330621 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330668 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330708 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-config-data\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330747 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.330778 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-svc\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.339397 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.339440 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-4t5v7"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.343503 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.377043 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qp7\" (UniqueName: \"kubernetes.io/projected/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-kube-api-access-x2qp7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.431837 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.431887 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-config-data\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.431914 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.431936 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-svc\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.431963 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a79a593-ee61-48db-a4db-103ea5495c3a-logs\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.431978 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.431995 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-config\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.432038 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjm2\" (UniqueName: \"kubernetes.io/projected/7a79a593-ee61-48db-a4db-103ea5495c3a-kube-api-access-6bjm2\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.432055 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: 
\"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.432089 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gs5x\" (UniqueName: \"kubernetes.io/projected/a930b062-89d9-4d8d-b649-b926aa8b2fe9-kube-api-access-7gs5x\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.433081 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.436892 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.437578 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-config\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.438047 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc 
kubenswrapper[4606]: I1212 00:48:12.438530 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-svc\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.439001 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.439326 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a79a593-ee61-48db-a4db-103ea5495c3a-logs\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.450669 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-config-data\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.454542 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjm2\" (UniqueName: \"kubernetes.io/projected/7a79a593-ee61-48db-a4db-103ea5495c3a-kube-api-access-6bjm2\") pod \"nova-metadata-0\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.456489 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gs5x\" (UniqueName: 
\"kubernetes.io/projected/a930b062-89d9-4d8d-b649-b926aa8b2fe9-kube-api-access-7gs5x\") pod \"dnsmasq-dns-bccf8f775-4t5v7\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.475590 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.546598 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.655027 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tfrdx"] Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.669662 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.740727 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tfrdx" event={"ID":"8a61d852-d814-4230-9ab7-4d0b5742b00a","Type":"ContainerStarted","Data":"7c3e1913d598bc7b85968e5cefc571c517b0036f783bba4d89845b092551127c"} Dec 12 00:48:12 crc kubenswrapper[4606]: I1212 00:48:12.848225 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:12 crc kubenswrapper[4606]: W1212 00:48:12.851381 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce96712_78cb_43cb_8f08_563664ef0451.slice/crio-01856a9a29d4ac33b341a40d3afdc2f4f3e79e81347e31b346d99fd7d470af37 WatchSource:0}: Error finding container 01856a9a29d4ac33b341a40d3afdc2f4f3e79e81347e31b346d99fd7d470af37: Status 404 returned error can't find the container with id 01856a9a29d4ac33b341a40d3afdc2f4f3e79e81347e31b346d99fd7d470af37 Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.048786 4606 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.241287 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:13 crc kubenswrapper[4606]: W1212 00:48:13.407660 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a79a593_ee61_48db_a4db_103ea5495c3a.slice/crio-bb32fabdffc4212d66266dacc3b9659cc8a283717da8aca6b9e2351e4a3c8f1c WatchSource:0}: Error finding container bb32fabdffc4212d66266dacc3b9659cc8a283717da8aca6b9e2351e4a3c8f1c: Status 404 returned error can't find the container with id bb32fabdffc4212d66266dacc3b9659cc8a283717da8aca6b9e2351e4a3c8f1c Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.412038 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.553402 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-4t5v7"] Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.610066 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xzftq"] Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.611396 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.616545 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.616692 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.648394 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xzftq"] Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.660318 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r542z\" (UniqueName: \"kubernetes.io/projected/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-kube-api-access-r542z\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.660479 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.660654 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-config-data\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.660859 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-scripts\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.754951 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"477394c1-2eda-4a72-92af-ad59f431fe83","Type":"ContainerStarted","Data":"fecbc1b1088cd62428eba7f7fea785301ba2f231ddc9bb63b50d84b663f2d966"} Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.762668 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r542z\" (UniqueName: \"kubernetes.io/projected/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-kube-api-access-r542z\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.763403 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.764062 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-config-data\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.764246 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-scripts\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.768796 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-config-data\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.769011 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a79a593-ee61-48db-a4db-103ea5495c3a","Type":"ContainerStarted","Data":"bb32fabdffc4212d66266dacc3b9659cc8a283717da8aca6b9e2351e4a3c8f1c"} Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.770875 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.771305 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" event={"ID":"a930b062-89d9-4d8d-b649-b926aa8b2fe9","Type":"ContainerStarted","Data":"d8d7a1efbaab6d22e6d30aee564eb6894063dd038290ebec51ee6c47076801f1"} Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.775253 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-scripts\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 
00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.807468 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tfrdx" event={"ID":"8a61d852-d814-4230-9ab7-4d0b5742b00a","Type":"ContainerStarted","Data":"e4bfde433c9e93055eba44d7b0159e70ca005ce3945986a0a0d98516213ccef1"} Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.818040 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e","Type":"ContainerStarted","Data":"056bf1efcf69368f446da4035b3875754ceb7bd5466d6bd0609302a3f7a2bbd7"} Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.823808 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r542z\" (UniqueName: \"kubernetes.io/projected/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-kube-api-access-r542z\") pod \"nova-cell1-conductor-db-sync-xzftq\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.826248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ce96712-78cb-43cb-8f08-563664ef0451","Type":"ContainerStarted","Data":"01856a9a29d4ac33b341a40d3afdc2f4f3e79e81347e31b346d99fd7d470af37"} Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.852060 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tfrdx" podStartSLOduration=2.852020993 podStartE2EDuration="2.852020993s" podCreationTimestamp="2025-12-12 00:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:13.829367356 +0000 UTC m=+1484.374720232" watchObservedRunningTime="2025-12-12 00:48:13.852020993 +0000 UTC m=+1484.397373859" Dec 12 00:48:13 crc kubenswrapper[4606]: I1212 00:48:13.943006 4606 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:14 crc kubenswrapper[4606]: I1212 00:48:14.443970 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xzftq"] Dec 12 00:48:14 crc kubenswrapper[4606]: I1212 00:48:14.838896 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xzftq" event={"ID":"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4","Type":"ContainerStarted","Data":"ba05fb9ddf63c631f27d231a8d8954e06df430388b725101e1130c24e60f9fcc"} Dec 12 00:48:14 crc kubenswrapper[4606]: I1212 00:48:14.838937 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xzftq" event={"ID":"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4","Type":"ContainerStarted","Data":"f97546ee5ba64bdf3a6821924c9ce6a7dbf9b162c2f335803998f3e77dab335e"} Dec 12 00:48:14 crc kubenswrapper[4606]: I1212 00:48:14.848554 4606 generic.go:334] "Generic (PLEG): container finished" podID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerID="1fe3f35acb6880fa87ad534db567d0670de94b1081cc7dbaaf1a7e703fe31835" exitCode=0 Dec 12 00:48:14 crc kubenswrapper[4606]: I1212 00:48:14.849305 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" event={"ID":"a930b062-89d9-4d8d-b649-b926aa8b2fe9","Type":"ContainerDied","Data":"1fe3f35acb6880fa87ad534db567d0670de94b1081cc7dbaaf1a7e703fe31835"} Dec 12 00:48:14 crc kubenswrapper[4606]: I1212 00:48:14.862403 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xzftq" podStartSLOduration=1.862381347 podStartE2EDuration="1.862381347s" podCreationTimestamp="2025-12-12 00:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:14.855612476 +0000 UTC m=+1485.400965342" watchObservedRunningTime="2025-12-12 
00:48:14.862381347 +0000 UTC m=+1485.407734213" Dec 12 00:48:15 crc kubenswrapper[4606]: I1212 00:48:15.704453 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:48:15 crc kubenswrapper[4606]: E1212 00:48:15.704877 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:48:15 crc kubenswrapper[4606]: I1212 00:48:15.840379 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:15 crc kubenswrapper[4606]: I1212 00:48:15.861760 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" event={"ID":"a930b062-89d9-4d8d-b649-b926aa8b2fe9","Type":"ContainerStarted","Data":"373b04978f7f7adca9a70fa32eb7c948937a6429caa347afdeb01a6f17840877"} Dec 12 00:48:15 crc kubenswrapper[4606]: I1212 00:48:15.861817 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:15 crc kubenswrapper[4606]: I1212 00:48:15.911235 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.899158 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e","Type":"ContainerStarted","Data":"ca55a0c69b9e5148ed7e570d1e1cff32cf0739821995257ca5662d20ac134add"} Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.899323 4606 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-novncproxy-0" podUID="1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ca55a0c69b9e5148ed7e570d1e1cff32cf0739821995257ca5662d20ac134add" gracePeriod=30 Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.901566 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ce96712-78cb-43cb-8f08-563664ef0451","Type":"ContainerStarted","Data":"624269db7cd2a4c1fa80f0d6faad7581358518b0167ee7b8a274beca76759320"} Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.901587 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ce96712-78cb-43cb-8f08-563664ef0451","Type":"ContainerStarted","Data":"c7e5d0ce7189dac632f9180bfe3beeed6b24362506942aa82865c7a3ded94093"} Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.904287 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"477394c1-2eda-4a72-92af-ad59f431fe83","Type":"ContainerStarted","Data":"adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996"} Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.911292 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a79a593-ee61-48db-a4db-103ea5495c3a","Type":"ContainerStarted","Data":"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2"} Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.911323 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a79a593-ee61-48db-a4db-103ea5495c3a","Type":"ContainerStarted","Data":"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd"} Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.911475 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-log" 
containerID="cri-o://aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd" gracePeriod=30 Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.911533 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-metadata" containerID="cri-o://34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2" gracePeriod=30 Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.921592 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" podStartSLOduration=6.921573417 podStartE2EDuration="6.921573417s" podCreationTimestamp="2025-12-12 00:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:15.916612539 +0000 UTC m=+1486.461965405" watchObservedRunningTime="2025-12-12 00:48:18.921573417 +0000 UTC m=+1489.466926283" Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.951123 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.200636105 podStartE2EDuration="7.951108138s" podCreationTimestamp="2025-12-12 00:48:11 +0000 UTC" firstStartedPulling="2025-12-12 00:48:12.859511955 +0000 UTC m=+1483.404864821" lastFinishedPulling="2025-12-12 00:48:17.609983988 +0000 UTC m=+1488.155336854" observedRunningTime="2025-12-12 00:48:18.949576307 +0000 UTC m=+1489.494929173" watchObservedRunningTime="2025-12-12 00:48:18.951108138 +0000 UTC m=+1489.496461004" Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.953295 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.60158553 podStartE2EDuration="7.953278426s" podCreationTimestamp="2025-12-12 00:48:11 +0000 UTC" firstStartedPulling="2025-12-12 00:48:13.259123784 +0000 UTC 
m=+1483.804476650" lastFinishedPulling="2025-12-12 00:48:17.61081668 +0000 UTC m=+1488.156169546" observedRunningTime="2025-12-12 00:48:18.928602655 +0000 UTC m=+1489.473955521" watchObservedRunningTime="2025-12-12 00:48:18.953278426 +0000 UTC m=+1489.498631292" Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.970529 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.454295913 podStartE2EDuration="7.970513818s" podCreationTimestamp="2025-12-12 00:48:11 +0000 UTC" firstStartedPulling="2025-12-12 00:48:13.094740039 +0000 UTC m=+1483.640092905" lastFinishedPulling="2025-12-12 00:48:17.610957944 +0000 UTC m=+1488.156310810" observedRunningTime="2025-12-12 00:48:18.964062175 +0000 UTC m=+1489.509415041" watchObservedRunningTime="2025-12-12 00:48:18.970513818 +0000 UTC m=+1489.515866684" Dec 12 00:48:18 crc kubenswrapper[4606]: I1212 00:48:18.986857 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.779916139 podStartE2EDuration="7.986838275s" podCreationTimestamp="2025-12-12 00:48:11 +0000 UTC" firstStartedPulling="2025-12-12 00:48:13.410815709 +0000 UTC m=+1483.956168575" lastFinishedPulling="2025-12-12 00:48:17.617737845 +0000 UTC m=+1488.163090711" observedRunningTime="2025-12-12 00:48:18.984894783 +0000 UTC m=+1489.530247649" watchObservedRunningTime="2025-12-12 00:48:18.986838275 +0000 UTC m=+1489.532191141" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.692411 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.811125 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a79a593-ee61-48db-a4db-103ea5495c3a-logs\") pod \"7a79a593-ee61-48db-a4db-103ea5495c3a\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.811183 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-config-data\") pod \"7a79a593-ee61-48db-a4db-103ea5495c3a\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.811215 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bjm2\" (UniqueName: \"kubernetes.io/projected/7a79a593-ee61-48db-a4db-103ea5495c3a-kube-api-access-6bjm2\") pod \"7a79a593-ee61-48db-a4db-103ea5495c3a\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.811305 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-combined-ca-bundle\") pod \"7a79a593-ee61-48db-a4db-103ea5495c3a\" (UID: \"7a79a593-ee61-48db-a4db-103ea5495c3a\") " Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.811518 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a79a593-ee61-48db-a4db-103ea5495c3a-logs" (OuterVolumeSpecName: "logs") pod "7a79a593-ee61-48db-a4db-103ea5495c3a" (UID: "7a79a593-ee61-48db-a4db-103ea5495c3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.812271 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a79a593-ee61-48db-a4db-103ea5495c3a-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.816857 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a79a593-ee61-48db-a4db-103ea5495c3a-kube-api-access-6bjm2" (OuterVolumeSpecName: "kube-api-access-6bjm2") pod "7a79a593-ee61-48db-a4db-103ea5495c3a" (UID: "7a79a593-ee61-48db-a4db-103ea5495c3a"). InnerVolumeSpecName "kube-api-access-6bjm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.851314 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-config-data" (OuterVolumeSpecName: "config-data") pod "7a79a593-ee61-48db-a4db-103ea5495c3a" (UID: "7a79a593-ee61-48db-a4db-103ea5495c3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.857418 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a79a593-ee61-48db-a4db-103ea5495c3a" (UID: "7a79a593-ee61-48db-a4db-103ea5495c3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.914413 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.914450 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bjm2\" (UniqueName: \"kubernetes.io/projected/7a79a593-ee61-48db-a4db-103ea5495c3a-kube-api-access-6bjm2\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.914464 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a79a593-ee61-48db-a4db-103ea5495c3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.922763 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.922798 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a79a593-ee61-48db-a4db-103ea5495c3a","Type":"ContainerDied","Data":"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2"} Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.922856 4606 scope.go:117] "RemoveContainer" containerID="34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.922674 4606 generic.go:334] "Generic (PLEG): container finished" podID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerID="34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2" exitCode=0 Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.923223 4606 generic.go:334] "Generic (PLEG): container finished" podID="7a79a593-ee61-48db-a4db-103ea5495c3a" 
containerID="aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd" exitCode=143 Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.923303 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a79a593-ee61-48db-a4db-103ea5495c3a","Type":"ContainerDied","Data":"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd"} Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.923333 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7a79a593-ee61-48db-a4db-103ea5495c3a","Type":"ContainerDied","Data":"bb32fabdffc4212d66266dacc3b9659cc8a283717da8aca6b9e2351e4a3c8f1c"} Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.973009 4606 scope.go:117] "RemoveContainer" containerID="aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd" Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.978694 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:19 crc kubenswrapper[4606]: I1212 00:48:19.988679 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.001032 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:20 crc kubenswrapper[4606]: E1212 00:48:20.008925 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-log" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.009010 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-log" Dec 12 00:48:20 crc kubenswrapper[4606]: E1212 00:48:20.009086 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-metadata" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.009144 4606 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-metadata" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.009391 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-metadata" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.009459 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" containerName="nova-metadata-log" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.016971 4606 scope.go:117] "RemoveContainer" containerID="34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.017125 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: E1212 00:48:20.017486 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2\": container with ID starting with 34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2 not found: ID does not exist" containerID="34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.017639 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2"} err="failed to get container status \"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2\": rpc error: code = NotFound desc = could not find container \"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2\": container with ID starting with 34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2 not found: ID does not exist" Dec 12 00:48:20 crc kubenswrapper[4606]: 
I1212 00:48:20.018527 4606 scope.go:117] "RemoveContainer" containerID="aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd" Dec 12 00:48:20 crc kubenswrapper[4606]: E1212 00:48:20.019043 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd\": container with ID starting with aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd not found: ID does not exist" containerID="aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.019088 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd"} err="failed to get container status \"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd\": rpc error: code = NotFound desc = could not find container \"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd\": container with ID starting with aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd not found: ID does not exist" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.019108 4606 scope.go:117] "RemoveContainer" containerID="34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.019361 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2"} err="failed to get container status \"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2\": rpc error: code = NotFound desc = could not find container \"34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2\": container with ID starting with 34813c56595b02722b34c77154376551e202b27de7a2d2f9a7e225a6e1ab25e2 not found: ID does not exist" Dec 12 00:48:20 crc 
kubenswrapper[4606]: I1212 00:48:20.019377 4606 scope.go:117] "RemoveContainer" containerID="aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.019592 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd"} err="failed to get container status \"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd\": rpc error: code = NotFound desc = could not find container \"aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd\": container with ID starting with aa52ea42bbcdf2a87d165746d208a51465d0ed2df49e64f80e40fbcaa4a350dd not found: ID does not exist" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.020723 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.022121 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.043702 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.138669 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.138726 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") 
" pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.138761 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06336572-8158-4ce2-a6fa-ab2cf40ca435-logs\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.138896 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hrwj\" (UniqueName: \"kubernetes.io/projected/06336572-8158-4ce2-a6fa-ab2cf40ca435-kube-api-access-6hrwj\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.139275 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-config-data\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.241610 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hrwj\" (UniqueName: \"kubernetes.io/projected/06336572-8158-4ce2-a6fa-ab2cf40ca435-kube-api-access-6hrwj\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.241770 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-config-data\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.241825 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.241855 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.241889 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06336572-8158-4ce2-a6fa-ab2cf40ca435-logs\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.242569 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06336572-8158-4ce2-a6fa-ab2cf40ca435-logs\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.246919 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-config-data\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.247655 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.256105 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.261721 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hrwj\" (UniqueName: \"kubernetes.io/projected/06336572-8158-4ce2-a6fa-ab2cf40ca435-kube-api-access-6hrwj\") pod \"nova-metadata-0\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.359338 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.921348 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:20 crc kubenswrapper[4606]: I1212 00:48:20.948410 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06336572-8158-4ce2-a6fa-ab2cf40ca435","Type":"ContainerStarted","Data":"de9fc49121194e33d12b0e23187d36c62e6f8d25179d6cd73c5d7d2e8e363db2"} Dec 12 00:48:21 crc kubenswrapper[4606]: I1212 00:48:21.715688 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a79a593-ee61-48db-a4db-103ea5495c3a" path="/var/lib/kubelet/pods/7a79a593-ee61-48db-a4db-103ea5495c3a/volumes" Dec 12 00:48:21 crc kubenswrapper[4606]: I1212 00:48:21.971979 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06336572-8158-4ce2-a6fa-ab2cf40ca435","Type":"ContainerStarted","Data":"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f"} Dec 12 
00:48:21 crc kubenswrapper[4606]: I1212 00:48:21.972033 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06336572-8158-4ce2-a6fa-ab2cf40ca435","Type":"ContainerStarted","Data":"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253"} Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.001563 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.001608 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.006981 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.006957743 podStartE2EDuration="3.006957743s" podCreationTimestamp="2025-12-12 00:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:21.995715372 +0000 UTC m=+1492.541068238" watchObservedRunningTime="2025-12-12 00:48:22.006957743 +0000 UTC m=+1492.552310609" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.178473 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b9fb498f6-62fcc" podUID="e38df57e-1a86-4c45-bf40-6282a6a049ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.318245 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.318310 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.345880 4606 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.476684 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.672505 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.819452 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-p8nxl"] Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.822952 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" podUID="36e6ef5a-367e-412f-af42-3bba95417184" containerName="dnsmasq-dns" containerID="cri-o://cf236970ecdd890458909955d16a3c5ec90b299c8f6eee249228c1fd39f1aacf" gracePeriod=10 Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.995317 4606 generic.go:334] "Generic (PLEG): container finished" podID="36e6ef5a-367e-412f-af42-3bba95417184" containerID="cf236970ecdd890458909955d16a3c5ec90b299c8f6eee249228c1fd39f1aacf" exitCode=0 Dec 12 00:48:22 crc kubenswrapper[4606]: I1212 00:48:22.995397 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" event={"ID":"36e6ef5a-367e-412f-af42-3bba95417184","Type":"ContainerDied","Data":"cf236970ecdd890458909955d16a3c5ec90b299c8f6eee249228c1fd39f1aacf"} Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.003517 4606 generic.go:334] "Generic (PLEG): container finished" podID="8a61d852-d814-4230-9ab7-4d0b5742b00a" containerID="e4bfde433c9e93055eba44d7b0159e70ca005ce3945986a0a0d98516213ccef1" exitCode=0 Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.003644 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tfrdx" 
event={"ID":"8a61d852-d814-4230-9ab7-4d0b5742b00a","Type":"ContainerDied","Data":"e4bfde433c9e93055eba44d7b0159e70ca005ce3945986a0a0d98516213ccef1"} Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.042033 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.087812 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.087922 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.444840 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.633927 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-svc\") pod \"36e6ef5a-367e-412f-af42-3bba95417184\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.634378 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-config\") pod \"36e6ef5a-367e-412f-af42-3bba95417184\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.634553 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4dgq\" (UniqueName: \"kubernetes.io/projected/36e6ef5a-367e-412f-af42-3bba95417184-kube-api-access-c4dgq\") pod \"36e6ef5a-367e-412f-af42-3bba95417184\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.634834 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-sb\") pod \"36e6ef5a-367e-412f-af42-3bba95417184\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.635100 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-swift-storage-0\") pod \"36e6ef5a-367e-412f-af42-3bba95417184\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.635307 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-nb\") pod \"36e6ef5a-367e-412f-af42-3bba95417184\" (UID: \"36e6ef5a-367e-412f-af42-3bba95417184\") " Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.693062 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36e6ef5a-367e-412f-af42-3bba95417184-kube-api-access-c4dgq" (OuterVolumeSpecName: "kube-api-access-c4dgq") pod "36e6ef5a-367e-412f-af42-3bba95417184" (UID: "36e6ef5a-367e-412f-af42-3bba95417184"). InnerVolumeSpecName "kube-api-access-c4dgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.738480 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4dgq\" (UniqueName: \"kubernetes.io/projected/36e6ef5a-367e-412f-af42-3bba95417184-kube-api-access-c4dgq\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.785675 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36e6ef5a-367e-412f-af42-3bba95417184" (UID: "36e6ef5a-367e-412f-af42-3bba95417184"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.803994 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "36e6ef5a-367e-412f-af42-3bba95417184" (UID: "36e6ef5a-367e-412f-af42-3bba95417184"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.815121 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-config" (OuterVolumeSpecName: "config") pod "36e6ef5a-367e-412f-af42-3bba95417184" (UID: "36e6ef5a-367e-412f-af42-3bba95417184"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.820881 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36e6ef5a-367e-412f-af42-3bba95417184" (UID: "36e6ef5a-367e-412f-af42-3bba95417184"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.822641 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36e6ef5a-367e-412f-af42-3bba95417184" (UID: "36e6ef5a-367e-412f-af42-3bba95417184"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.839943 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.839967 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.839976 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.839986 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:23 crc kubenswrapper[4606]: I1212 00:48:23.839996 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36e6ef5a-367e-412f-af42-3bba95417184-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.015565 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" event={"ID":"36e6ef5a-367e-412f-af42-3bba95417184","Type":"ContainerDied","Data":"343ca413d6d41b7feaf5639e68653591545070834a2d53185e957e8f56ef024e"} Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.015636 4606 scope.go:117] "RemoveContainer" containerID="cf236970ecdd890458909955d16a3c5ec90b299c8f6eee249228c1fd39f1aacf" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.016132 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-p8nxl" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.017989 4606 generic.go:334] "Generic (PLEG): container finished" podID="cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" containerID="ba05fb9ddf63c631f27d231a8d8954e06df430388b725101e1130c24e60f9fcc" exitCode=0 Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.018624 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xzftq" event={"ID":"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4","Type":"ContainerDied","Data":"ba05fb9ddf63c631f27d231a8d8954e06df430388b725101e1130c24e60f9fcc"} Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.053633 4606 scope.go:117] "RemoveContainer" containerID="060726d7cba75c94080e422330e8a6b95588f87352842c10483c5166661c537c" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.079017 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-p8nxl"] Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.088517 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-p8nxl"] Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.407810 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.552957 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-config-data\") pod \"8a61d852-d814-4230-9ab7-4d0b5742b00a\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.553025 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwblg\" (UniqueName: \"kubernetes.io/projected/8a61d852-d814-4230-9ab7-4d0b5742b00a-kube-api-access-rwblg\") pod \"8a61d852-d814-4230-9ab7-4d0b5742b00a\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.553074 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-scripts\") pod \"8a61d852-d814-4230-9ab7-4d0b5742b00a\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.553093 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-combined-ca-bundle\") pod \"8a61d852-d814-4230-9ab7-4d0b5742b00a\" (UID: \"8a61d852-d814-4230-9ab7-4d0b5742b00a\") " Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.573255 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-scripts" (OuterVolumeSpecName: "scripts") pod "8a61d852-d814-4230-9ab7-4d0b5742b00a" (UID: "8a61d852-d814-4230-9ab7-4d0b5742b00a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.579339 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a61d852-d814-4230-9ab7-4d0b5742b00a-kube-api-access-rwblg" (OuterVolumeSpecName: "kube-api-access-rwblg") pod "8a61d852-d814-4230-9ab7-4d0b5742b00a" (UID: "8a61d852-d814-4230-9ab7-4d0b5742b00a"). InnerVolumeSpecName "kube-api-access-rwblg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.587451 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a61d852-d814-4230-9ab7-4d0b5742b00a" (UID: "8a61d852-d814-4230-9ab7-4d0b5742b00a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.604933 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-config-data" (OuterVolumeSpecName: "config-data") pod "8a61d852-d814-4230-9ab7-4d0b5742b00a" (UID: "8a61d852-d814-4230-9ab7-4d0b5742b00a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.655950 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.655990 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwblg\" (UniqueName: \"kubernetes.io/projected/8a61d852-d814-4230-9ab7-4d0b5742b00a-kube-api-access-rwblg\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.656000 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:24 crc kubenswrapper[4606]: I1212 00:48:24.656009 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a61d852-d814-4230-9ab7-4d0b5742b00a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.027248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tfrdx" event={"ID":"8a61d852-d814-4230-9ab7-4d0b5742b00a","Type":"ContainerDied","Data":"7c3e1913d598bc7b85968e5cefc571c517b0036f783bba4d89845b092551127c"} Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.027283 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c3e1913d598bc7b85968e5cefc571c517b0036f783bba4d89845b092551127c" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.028462 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tfrdx" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.320564 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.321491 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-log" containerID="cri-o://c7e5d0ce7189dac632f9180bfe3beeed6b24362506942aa82865c7a3ded94093" gracePeriod=30 Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.321972 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-api" containerID="cri-o://624269db7cd2a4c1fa80f0d6faad7581358518b0167ee7b8a274beca76759320" gracePeriod=30 Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.360140 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.361105 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.361795 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.361955 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="477394c1-2eda-4a72-92af-ad59f431fe83" containerName="nova-scheduler-scheduler" containerID="cri-o://adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996" gracePeriod=30 Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.432246 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.518667 4606 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.676002 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-combined-ca-bundle\") pod \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.676141 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r542z\" (UniqueName: \"kubernetes.io/projected/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-kube-api-access-r542z\") pod \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.676236 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-scripts\") pod \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.676283 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-config-data\") pod \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\" (UID: \"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4\") " Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.680738 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-kube-api-access-r542z" (OuterVolumeSpecName: "kube-api-access-r542z") pod "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" (UID: "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4"). InnerVolumeSpecName "kube-api-access-r542z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.685367 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-scripts" (OuterVolumeSpecName: "scripts") pod "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" (UID: "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.731389 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-config-data" (OuterVolumeSpecName: "config-data") pod "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" (UID: "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.788392 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r542z\" (UniqueName: \"kubernetes.io/projected/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-kube-api-access-r542z\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.788420 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.788429 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.797996 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36e6ef5a-367e-412f-af42-3bba95417184" path="/var/lib/kubelet/pods/36e6ef5a-367e-412f-af42-3bba95417184/volumes" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 
00:48:25.798385 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" (UID: "cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:25 crc kubenswrapper[4606]: I1212 00:48:25.890996 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.039001 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xzftq" event={"ID":"cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4","Type":"ContainerDied","Data":"f97546ee5ba64bdf3a6821924c9ce6a7dbf9b162c2f335803998f3e77dab335e"} Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.039034 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f97546ee5ba64bdf3a6821924c9ce6a7dbf9b162c2f335803998f3e77dab335e" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.039105 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xzftq" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.041999 4606 generic.go:334] "Generic (PLEG): container finished" podID="4ce96712-78cb-43cb-8f08-563664ef0451" containerID="c7e5d0ce7189dac632f9180bfe3beeed6b24362506942aa82865c7a3ded94093" exitCode=143 Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.042775 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ce96712-78cb-43cb-8f08-563664ef0451","Type":"ContainerDied","Data":"c7e5d0ce7189dac632f9180bfe3beeed6b24362506942aa82865c7a3ded94093"} Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.116705 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 12 00:48:26 crc kubenswrapper[4606]: E1212 00:48:26.117050 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a61d852-d814-4230-9ab7-4d0b5742b00a" containerName="nova-manage" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117067 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a61d852-d814-4230-9ab7-4d0b5742b00a" containerName="nova-manage" Dec 12 00:48:26 crc kubenswrapper[4606]: E1212 00:48:26.117087 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" containerName="nova-cell1-conductor-db-sync" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117094 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" containerName="nova-cell1-conductor-db-sync" Dec 12 00:48:26 crc kubenswrapper[4606]: E1212 00:48:26.117108 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e6ef5a-367e-412f-af42-3bba95417184" containerName="dnsmasq-dns" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117114 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e6ef5a-367e-412f-af42-3bba95417184" containerName="dnsmasq-dns" Dec 12 00:48:26 
crc kubenswrapper[4606]: E1212 00:48:26.117138 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e6ef5a-367e-412f-af42-3bba95417184" containerName="init" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117143 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e6ef5a-367e-412f-af42-3bba95417184" containerName="init" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117325 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="36e6ef5a-367e-412f-af42-3bba95417184" containerName="dnsmasq-dns" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117337 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" containerName="nova-cell1-conductor-db-sync" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117347 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a61d852-d814-4230-9ab7-4d0b5742b00a" containerName="nova-manage" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.117932 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.124947 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.133796 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.195983 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ww42\" (UniqueName: \"kubernetes.io/projected/11fcc545-3d1b-4a4a-b302-c1b565908edf-kube-api-access-7ww42\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.196380 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11fcc545-3d1b-4a4a-b302-c1b565908edf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.196416 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fcc545-3d1b-4a4a-b302-c1b565908edf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.297764 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ww42\" (UniqueName: \"kubernetes.io/projected/11fcc545-3d1b-4a4a-b302-c1b565908edf-kube-api-access-7ww42\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 
00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.297821 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11fcc545-3d1b-4a4a-b302-c1b565908edf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.297842 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fcc545-3d1b-4a4a-b302-c1b565908edf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.304605 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11fcc545-3d1b-4a4a-b302-c1b565908edf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.307035 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11fcc545-3d1b-4a4a-b302-c1b565908edf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.313617 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ww42\" (UniqueName: \"kubernetes.io/projected/11fcc545-3d1b-4a4a-b302-c1b565908edf-kube-api-access-7ww42\") pod \"nova-cell1-conductor-0\" (UID: \"11fcc545-3d1b-4a4a-b302-c1b565908edf\") " pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.439316 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:26 crc kubenswrapper[4606]: I1212 00:48:26.912201 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.050980 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-log" containerID="cri-o://c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253" gracePeriod=30 Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.051071 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"11fcc545-3d1b-4a4a-b302-c1b565908edf","Type":"ContainerStarted","Data":"76e30fddf91523c75428147dddd9c77f66f5f7ab365d4e352817ca59616cd2b1"} Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.051429 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-metadata" containerID="cri-o://6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f" gracePeriod=30 Dec 12 00:48:27 crc kubenswrapper[4606]: E1212 00:48:27.319312 4606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 12 00:48:27 crc kubenswrapper[4606]: E1212 00:48:27.320786 4606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 12 00:48:27 crc kubenswrapper[4606]: E1212 00:48:27.321969 4606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 12 00:48:27 crc kubenswrapper[4606]: E1212 00:48:27.322001 4606 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="477394c1-2eda-4a72-92af-ad59f431fe83" containerName="nova-scheduler-scheduler" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.601419 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.700014 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:48:27 crc kubenswrapper[4606]: E1212 00:48:27.700842 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.725941 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06336572-8158-4ce2-a6fa-ab2cf40ca435-logs\") pod \"06336572-8158-4ce2-a6fa-ab2cf40ca435\" (UID: 
\"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.726066 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-combined-ca-bundle\") pod \"06336572-8158-4ce2-a6fa-ab2cf40ca435\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.726119 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-config-data\") pod \"06336572-8158-4ce2-a6fa-ab2cf40ca435\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.726255 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-nova-metadata-tls-certs\") pod \"06336572-8158-4ce2-a6fa-ab2cf40ca435\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.726289 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hrwj\" (UniqueName: \"kubernetes.io/projected/06336572-8158-4ce2-a6fa-ab2cf40ca435-kube-api-access-6hrwj\") pod \"06336572-8158-4ce2-a6fa-ab2cf40ca435\" (UID: \"06336572-8158-4ce2-a6fa-ab2cf40ca435\") " Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.726534 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06336572-8158-4ce2-a6fa-ab2cf40ca435-logs" (OuterVolumeSpecName: "logs") pod "06336572-8158-4ce2-a6fa-ab2cf40ca435" (UID: "06336572-8158-4ce2-a6fa-ab2cf40ca435"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.727007 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06336572-8158-4ce2-a6fa-ab2cf40ca435-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.732765 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06336572-8158-4ce2-a6fa-ab2cf40ca435-kube-api-access-6hrwj" (OuterVolumeSpecName: "kube-api-access-6hrwj") pod "06336572-8158-4ce2-a6fa-ab2cf40ca435" (UID: "06336572-8158-4ce2-a6fa-ab2cf40ca435"). InnerVolumeSpecName "kube-api-access-6hrwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.756787 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-config-data" (OuterVolumeSpecName: "config-data") pod "06336572-8158-4ce2-a6fa-ab2cf40ca435" (UID: "06336572-8158-4ce2-a6fa-ab2cf40ca435"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.763498 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06336572-8158-4ce2-a6fa-ab2cf40ca435" (UID: "06336572-8158-4ce2-a6fa-ab2cf40ca435"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.788673 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "06336572-8158-4ce2-a6fa-ab2cf40ca435" (UID: "06336572-8158-4ce2-a6fa-ab2cf40ca435"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.828972 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.829017 4606 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.829039 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hrwj\" (UniqueName: \"kubernetes.io/projected/06336572-8158-4ce2-a6fa-ab2cf40ca435-kube-api-access-6hrwj\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:27 crc kubenswrapper[4606]: I1212 00:48:27.829059 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06336572-8158-4ce2-a6fa-ab2cf40ca435-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.061417 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"11fcc545-3d1b-4a4a-b302-c1b565908edf","Type":"ContainerStarted","Data":"afac9dc462bc2e80b21a1555974d4f4b2848df3b3f187381a27f085d4a9bdc58"} Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.062591 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.066373 4606 generic.go:334] "Generic (PLEG): container finished" podID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerID="6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f" exitCode=0 Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.066407 4606 generic.go:334] "Generic (PLEG): container finished" podID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerID="c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253" exitCode=143 Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.066429 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06336572-8158-4ce2-a6fa-ab2cf40ca435","Type":"ContainerDied","Data":"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f"} Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.066456 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06336572-8158-4ce2-a6fa-ab2cf40ca435","Type":"ContainerDied","Data":"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253"} Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.066469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06336572-8158-4ce2-a6fa-ab2cf40ca435","Type":"ContainerDied","Data":"de9fc49121194e33d12b0e23187d36c62e6f8d25179d6cd73c5d7d2e8e363db2"} Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.066489 4606 scope.go:117] "RemoveContainer" containerID="6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.066617 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.082156 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.082117356 podStartE2EDuration="2.082117356s" podCreationTimestamp="2025-12-12 00:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:28.077819691 +0000 UTC m=+1498.623172557" watchObservedRunningTime="2025-12-12 00:48:28.082117356 +0000 UTC m=+1498.627470222" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.126758 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.127194 4606 scope.go:117] "RemoveContainer" containerID="c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.171010 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.182016 4606 scope.go:117] "RemoveContainer" containerID="6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f" Dec 12 00:48:28 crc kubenswrapper[4606]: E1212 00:48:28.182477 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f\": container with ID starting with 6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f not found: ID does not exist" containerID="6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.182513 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f"} err="failed to get 
container status \"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f\": rpc error: code = NotFound desc = could not find container \"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f\": container with ID starting with 6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f not found: ID does not exist" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.182533 4606 scope.go:117] "RemoveContainer" containerID="c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253" Dec 12 00:48:28 crc kubenswrapper[4606]: E1212 00:48:28.182837 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253\": container with ID starting with c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253 not found: ID does not exist" containerID="c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.182858 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253"} err="failed to get container status \"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253\": rpc error: code = NotFound desc = could not find container \"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253\": container with ID starting with c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253 not found: ID does not exist" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.182873 4606 scope.go:117] "RemoveContainer" containerID="6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.183106 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f"} 
err="failed to get container status \"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f\": rpc error: code = NotFound desc = could not find container \"6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f\": container with ID starting with 6bdc44a9a6bdcf5dc5f65fdf264ce857eb0ccde5d1f0a7ea6d33b51781d9303f not found: ID does not exist" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.183122 4606 scope.go:117] "RemoveContainer" containerID="c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.183384 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253"} err="failed to get container status \"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253\": rpc error: code = NotFound desc = could not find container \"c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253\": container with ID starting with c38dec3a5de24fee067e5e182a1be14c4bc9b50903caf748df8e8a911fd3f253 not found: ID does not exist" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.184603 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:28 crc kubenswrapper[4606]: E1212 00:48:28.185001 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-log" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.185015 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-log" Dec 12 00:48:28 crc kubenswrapper[4606]: E1212 00:48:28.185052 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-metadata" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.185058 4606 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-metadata" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.185266 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-log" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.185288 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" containerName="nova-metadata-metadata" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.186292 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.191490 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.191673 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.192668 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.337789 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.337873 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc 
kubenswrapper[4606]: I1212 00:48:28.337933 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176ccd-46b9-44e4-af1b-f02f91762469-logs\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.337970 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6dj\" (UniqueName: \"kubernetes.io/projected/a4176ccd-46b9-44e4-af1b-f02f91762469-kube-api-access-6b6dj\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.338054 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-config-data\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.440065 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.440138 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.440190 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176ccd-46b9-44e4-af1b-f02f91762469-logs\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.440218 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b6dj\" (UniqueName: \"kubernetes.io/projected/a4176ccd-46b9-44e4-af1b-f02f91762469-kube-api-access-6b6dj\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.440267 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-config-data\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.440958 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176ccd-46b9-44e4-af1b-f02f91762469-logs\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.445312 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.446306 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-config-data\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 
00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.448022 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.461359 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b6dj\" (UniqueName: \"kubernetes.io/projected/a4176ccd-46b9-44e4-af1b-f02f91762469-kube-api-access-6b6dj\") pod \"nova-metadata-0\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " pod="openstack/nova-metadata-0" Dec 12 00:48:28 crc kubenswrapper[4606]: I1212 00:48:28.507664 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:48:29 crc kubenswrapper[4606]: I1212 00:48:29.009634 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:48:29 crc kubenswrapper[4606]: I1212 00:48:29.077554 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176ccd-46b9-44e4-af1b-f02f91762469","Type":"ContainerStarted","Data":"5766b600fb26557a1b097f61b630512f202821ffd8d7f3c71a66c0881b354abd"} Dec 12 00:48:29 crc kubenswrapper[4606]: I1212 00:48:29.718711 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06336572-8158-4ce2-a6fa-ab2cf40ca435" path="/var/lib/kubelet/pods/06336572-8158-4ce2-a6fa-ab2cf40ca435/volumes" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.090087 4606 generic.go:334] "Generic (PLEG): container finished" podID="4ce96712-78cb-43cb-8f08-563664ef0451" containerID="624269db7cd2a4c1fa80f0d6faad7581358518b0167ee7b8a274beca76759320" exitCode=0 Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.090130 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4ce96712-78cb-43cb-8f08-563664ef0451","Type":"ContainerDied","Data":"624269db7cd2a4c1fa80f0d6faad7581358518b0167ee7b8a274beca76759320"} Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.092615 4606 generic.go:334] "Generic (PLEG): container finished" podID="477394c1-2eda-4a72-92af-ad59f431fe83" containerID="adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996" exitCode=0 Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.092885 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"477394c1-2eda-4a72-92af-ad59f431fe83","Type":"ContainerDied","Data":"adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996"} Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.096092 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176ccd-46b9-44e4-af1b-f02f91762469","Type":"ContainerStarted","Data":"7e10fb794188aaccf0db771b264ebc20e470eaec80fc45b8db4ef8f5f1e2b19b"} Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.096145 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176ccd-46b9-44e4-af1b-f02f91762469","Type":"ContainerStarted","Data":"bdf98d1f27cbe685dd1272f2502b449010d9c6a4277dc98d71146a6620cb0715"} Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.129355 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.12933328 podStartE2EDuration="2.12933328s" podCreationTimestamp="2025-12-12 00:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:30.120039541 +0000 UTC m=+1500.665392417" watchObservedRunningTime="2025-12-12 00:48:30.12933328 +0000 UTC m=+1500.674686146" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.237307 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.245558 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.406989 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-config-data\") pod \"4ce96712-78cb-43cb-8f08-563664ef0451\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.407377 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfgkm\" (UniqueName: \"kubernetes.io/projected/4ce96712-78cb-43cb-8f08-563664ef0451-kube-api-access-sfgkm\") pod \"4ce96712-78cb-43cb-8f08-563664ef0451\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.407465 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-combined-ca-bundle\") pod \"4ce96712-78cb-43cb-8f08-563664ef0451\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.407492 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce96712-78cb-43cb-8f08-563664ef0451-logs\") pod \"4ce96712-78cb-43cb-8f08-563664ef0451\" (UID: \"4ce96712-78cb-43cb-8f08-563664ef0451\") " Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.407532 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6xcx\" (UniqueName: \"kubernetes.io/projected/477394c1-2eda-4a72-92af-ad59f431fe83-kube-api-access-x6xcx\") pod \"477394c1-2eda-4a72-92af-ad59f431fe83\" (UID: 
\"477394c1-2eda-4a72-92af-ad59f431fe83\") " Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.407560 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-config-data\") pod \"477394c1-2eda-4a72-92af-ad59f431fe83\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.407594 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-combined-ca-bundle\") pod \"477394c1-2eda-4a72-92af-ad59f431fe83\" (UID: \"477394c1-2eda-4a72-92af-ad59f431fe83\") " Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.413969 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce96712-78cb-43cb-8f08-563664ef0451-logs" (OuterVolumeSpecName: "logs") pod "4ce96712-78cb-43cb-8f08-563664ef0451" (UID: "4ce96712-78cb-43cb-8f08-563664ef0451"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.417328 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce96712-78cb-43cb-8f08-563664ef0451-kube-api-access-sfgkm" (OuterVolumeSpecName: "kube-api-access-sfgkm") pod "4ce96712-78cb-43cb-8f08-563664ef0451" (UID: "4ce96712-78cb-43cb-8f08-563664ef0451"). InnerVolumeSpecName "kube-api-access-sfgkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.429482 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477394c1-2eda-4a72-92af-ad59f431fe83-kube-api-access-x6xcx" (OuterVolumeSpecName: "kube-api-access-x6xcx") pod "477394c1-2eda-4a72-92af-ad59f431fe83" (UID: "477394c1-2eda-4a72-92af-ad59f431fe83"). 
InnerVolumeSpecName "kube-api-access-x6xcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.438936 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "477394c1-2eda-4a72-92af-ad59f431fe83" (UID: "477394c1-2eda-4a72-92af-ad59f431fe83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.448247 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ce96712-78cb-43cb-8f08-563664ef0451" (UID: "4ce96712-78cb-43cb-8f08-563664ef0451"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.449649 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-config-data" (OuterVolumeSpecName: "config-data") pod "477394c1-2eda-4a72-92af-ad59f431fe83" (UID: "477394c1-2eda-4a72-92af-ad59f431fe83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.463378 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-config-data" (OuterVolumeSpecName: "config-data") pod "4ce96712-78cb-43cb-8f08-563664ef0451" (UID: "4ce96712-78cb-43cb-8f08-563664ef0451"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.509910 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfgkm\" (UniqueName: \"kubernetes.io/projected/4ce96712-78cb-43cb-8f08-563664ef0451-kube-api-access-sfgkm\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.509952 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.509964 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce96712-78cb-43cb-8f08-563664ef0451-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.509976 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6xcx\" (UniqueName: \"kubernetes.io/projected/477394c1-2eda-4a72-92af-ad59f431fe83-kube-api-access-x6xcx\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.509988 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.510002 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/477394c1-2eda-4a72-92af-ad59f431fe83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:30 crc kubenswrapper[4606]: I1212 00:48:30.510013 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce96712-78cb-43cb-8f08-563664ef0451-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.103853 4606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ce96712-78cb-43cb-8f08-563664ef0451","Type":"ContainerDied","Data":"01856a9a29d4ac33b341a40d3afdc2f4f3e79e81347e31b346d99fd7d470af37"} Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.103870 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.103927 4606 scope.go:117] "RemoveContainer" containerID="624269db7cd2a4c1fa80f0d6faad7581358518b0167ee7b8a274beca76759320" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.107120 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.110808 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"477394c1-2eda-4a72-92af-ad59f431fe83","Type":"ContainerDied","Data":"fecbc1b1088cd62428eba7f7fea785301ba2f231ddc9bb63b50d84b663f2d966"} Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.130671 4606 scope.go:117] "RemoveContainer" containerID="c7e5d0ce7189dac632f9180bfe3beeed6b24362506942aa82865c7a3ded94093" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.151095 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.177684 4606 scope.go:117] "RemoveContainer" containerID="adbd8ada1f7845cda5bfbd50137ee5c91b4d760242fa6fcdc78827f317c79996" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.243523 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.292515 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.304106 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-scheduler-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.316114 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: E1212 00:48:31.316719 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-api" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.316739 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-api" Dec 12 00:48:31 crc kubenswrapper[4606]: E1212 00:48:31.316771 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477394c1-2eda-4a72-92af-ad59f431fe83" containerName="nova-scheduler-scheduler" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.316779 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="477394c1-2eda-4a72-92af-ad59f431fe83" containerName="nova-scheduler-scheduler" Dec 12 00:48:31 crc kubenswrapper[4606]: E1212 00:48:31.316797 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-log" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.316803 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-log" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.316993 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-api" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.317018 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="477394c1-2eda-4a72-92af-ad59f431fe83" containerName="nova-scheduler-scheduler" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.317030 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" containerName="nova-api-log" Dec 12 
00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.318071 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.320365 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.334165 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.338685 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.341031 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.345321 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.370521 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.442523 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.442867 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-logs\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.443020 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.443125 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-config-data\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.443232 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr492\" (UniqueName: \"kubernetes.io/projected/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-kube-api-access-sr492\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.443347 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh4bc\" (UniqueName: \"kubernetes.io/projected/eac07a42-5397-42fd-be30-3677507d5a65-kube-api-access-wh4bc\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.443462 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-config-data\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.545227 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh4bc\" (UniqueName: 
\"kubernetes.io/projected/eac07a42-5397-42fd-be30-3677507d5a65-kube-api-access-wh4bc\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.545304 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-config-data\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.545399 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.545496 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-logs\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.545546 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.545617 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-config-data\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0" Dec 12 00:48:31 crc kubenswrapper[4606]: 
I1212 00:48:31.545981 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-logs\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.546279 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr492\" (UniqueName: \"kubernetes.io/projected/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-kube-api-access-sr492\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.552298 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.552847 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.554778 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-config-data\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.567385 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-config-data\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.572066 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh4bc\" (UniqueName: \"kubernetes.io/projected/eac07a42-5397-42fd-be30-3677507d5a65-kube-api-access-wh4bc\") pod \"nova-scheduler-0\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " pod="openstack/nova-scheduler-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.577246 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr492\" (UniqueName: \"kubernetes.io/projected/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-kube-api-access-sr492\") pod \"nova-api-0\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " pod="openstack/nova-api-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.645377 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.665902 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.714571 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477394c1-2eda-4a72-92af-ad59f431fe83" path="/var/lib/kubelet/pods/477394c1-2eda-4a72-92af-ad59f431fe83/volumes"
Dec 12 00:48:31 crc kubenswrapper[4606]: I1212 00:48:31.715407 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce96712-78cb-43cb-8f08-563664ef0451" path="/var/lib/kubelet/pods/4ce96712-78cb-43cb-8f08-563664ef0451/volumes"
Dec 12 00:48:32 crc kubenswrapper[4606]: I1212 00:48:32.130244 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 12 00:48:32 crc kubenswrapper[4606]: W1212 00:48:32.136523 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29f7a7c3_4ec4_4e0e_8309_6a9dddcd9b71.slice/crio-2f58c5386ff01ffd000085a77b3e76943ab3749aab770c67f8b13b5cdefe29a9 WatchSource:0}: Error finding container 2f58c5386ff01ffd000085a77b3e76943ab3749aab770c67f8b13b5cdefe29a9: Status 404 returned error can't find the container with id 2f58c5386ff01ffd000085a77b3e76943ab3749aab770c67f8b13b5cdefe29a9
Dec 12 00:48:32 crc kubenswrapper[4606]: I1212 00:48:32.189556 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 12 00:48:32 crc kubenswrapper[4606]: I1212 00:48:32.219690 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 12 00:48:32 crc kubenswrapper[4606]: W1212 00:48:32.223488 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeac07a42_5397_42fd_be30_3677507d5a65.slice/crio-6ca15f1e5bf3638c6d30145dbe36cfd605a19a9c538ee8748dfccbf5b8a13983 WatchSource:0}: Error finding container 6ca15f1e5bf3638c6d30145dbe36cfd605a19a9c538ee8748dfccbf5b8a13983: Status 404 returned error can't find the container with id 6ca15f1e5bf3638c6d30145dbe36cfd605a19a9c538ee8748dfccbf5b8a13983
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.141999 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71","Type":"ContainerStarted","Data":"ca87d7ac40c931cabbbafb535d048eb8408a18d2b49b368ae79cdd5222d438bb"}
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.143854 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71","Type":"ContainerStarted","Data":"a0008e6405cd0549dca2ad6c59ed2ff839f325226c110499b357f4629df23e89"}
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.143954 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71","Type":"ContainerStarted","Data":"2f58c5386ff01ffd000085a77b3e76943ab3749aab770c67f8b13b5cdefe29a9"}
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.149328 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eac07a42-5397-42fd-be30-3677507d5a65","Type":"ContainerStarted","Data":"76493356d682c0e7accbc9e0c87215456eb472aa3fd0b81543206e3f0dc86c5d"}
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.149385 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eac07a42-5397-42fd-be30-3677507d5a65","Type":"ContainerStarted","Data":"6ca15f1e5bf3638c6d30145dbe36cfd605a19a9c538ee8748dfccbf5b8a13983"}
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.168867 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.168848257 podStartE2EDuration="2.168848257s" podCreationTimestamp="2025-12-12 00:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:33.163198436 +0000 UTC m=+1503.708551322" watchObservedRunningTime="2025-12-12 00:48:33.168848257 +0000 UTC m=+1503.714201123"
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.194613 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.194592927 podStartE2EDuration="2.194592927s" podCreationTimestamp="2025-12-12 00:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:33.187484256 +0000 UTC m=+1503.732837122" watchObservedRunningTime="2025-12-12 00:48:33.194592927 +0000 UTC m=+1503.739945793"
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.509654 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 12 00:48:33 crc kubenswrapper[4606]: I1212 00:48:33.510011 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 12 00:48:34 crc kubenswrapper[4606]: I1212 00:48:34.572213 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b9fb498f6-62fcc"
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.024697 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.174704 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" containerName="kube-state-metrics" containerID="cri-o://19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c" gracePeriod=30
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.475146 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.652236 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b9fb498f6-62fcc"
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.666028 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.671478 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.805554 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79c99578bb-cdgsn"]
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.805973 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon-log" containerID="cri-o://3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79" gracePeriod=30
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.806072 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" containerID="cri-o://118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9" gracePeriod=30
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.857365 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmd5n\" (UniqueName: \"kubernetes.io/projected/fe774fb2-c953-4fc2-8f6b-ec94268d6e7d-kube-api-access-xmd5n\") pod \"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d\" (UID: \"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d\") "
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.877468 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe774fb2-c953-4fc2-8f6b-ec94268d6e7d-kube-api-access-xmd5n" (OuterVolumeSpecName: "kube-api-access-xmd5n") pod "fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" (UID: "fe774fb2-c953-4fc2-8f6b-ec94268d6e7d"). InnerVolumeSpecName "kube-api-access-xmd5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 12 00:48:36 crc kubenswrapper[4606]: I1212 00:48:36.960118 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmd5n\" (UniqueName: \"kubernetes.io/projected/fe774fb2-c953-4fc2-8f6b-ec94268d6e7d-kube-api-access-xmd5n\") on node \"crc\" DevicePath \"\""
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.185463 4606 generic.go:334] "Generic (PLEG): container finished" podID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" containerID="19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c" exitCode=2
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.185524 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.185526 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d","Type":"ContainerDied","Data":"19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c"}
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.185630 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fe774fb2-c953-4fc2-8f6b-ec94268d6e7d","Type":"ContainerDied","Data":"aa91c344846cd9f4512562ebc0594a7267f85faa5b548877caef0ee995c65cd7"}
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.185647 4606 scope.go:117] "RemoveContainer" containerID="19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.208899 4606 scope.go:117] "RemoveContainer" containerID="19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c"
Dec 12 00:48:37 crc kubenswrapper[4606]: E1212 00:48:37.210077 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c\": container with ID starting with 19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c not found: ID does not exist" containerID="19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.210121 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c"} err="failed to get container status \"19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c\": rpc error: code = NotFound desc = could not find container \"19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c\": container with ID starting with 19d90ed114f2005fa9d7fa01c73eda1cca64f83fdbd5875c73bf85360b9c124c not found: ID does not exist"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.229962 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.239714 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.259632 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 12 00:48:37 crc kubenswrapper[4606]: E1212 00:48:37.260065 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" containerName="kube-state-metrics"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.260082 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" containerName="kube-state-metrics"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.260256 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" containerName="kube-state-metrics"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.260919 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.267603 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.271554 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.271591 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.368847 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.369039 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.369181 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.369228 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzl4c\" (UniqueName: \"kubernetes.io/projected/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-api-access-kzl4c\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.471295 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.471386 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.471436 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.471459 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzl4c\" (UniqueName: \"kubernetes.io/projected/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-api-access-kzl4c\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.477042 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.477749 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.482832 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.489951 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzl4c\" (UniqueName: \"kubernetes.io/projected/14b69a07-590a-4574-b5b9-de1bfe8c8fcf-kube-api-access-kzl4c\") pod \"kube-state-metrics-0\" (UID: \"14b69a07-590a-4574-b5b9-de1bfe8c8fcf\") " pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.590788 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 12 00:48:37 crc kubenswrapper[4606]: I1212 00:48:37.722453 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe774fb2-c953-4fc2-8f6b-ec94268d6e7d" path="/var/lib/kubelet/pods/fe774fb2-c953-4fc2-8f6b-ec94268d6e7d/volumes"
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.051698 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 12 00:48:38 crc kubenswrapper[4606]: W1212 00:48:38.063922 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b69a07_590a_4574_b5b9_de1bfe8c8fcf.slice/crio-4e987b75e1717d9d81959eb8496bc1b9dd5475e29a1b88448e6320ff051adb99 WatchSource:0}: Error finding container 4e987b75e1717d9d81959eb8496bc1b9dd5475e29a1b88448e6320ff051adb99: Status 404 returned error can't find the container with id 4e987b75e1717d9d81959eb8496bc1b9dd5475e29a1b88448e6320ff051adb99
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.196356 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14b69a07-590a-4574-b5b9-de1bfe8c8fcf","Type":"ContainerStarted","Data":"4e987b75e1717d9d81959eb8496bc1b9dd5475e29a1b88448e6320ff051adb99"}
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.270375 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.270648 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="ceilometer-central-agent" containerID="cri-o://e9385f2519eb011ca784d1ede40a391d82acd7e77913ce00e762d707764b1127" gracePeriod=30
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.270700 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="sg-core" containerID="cri-o://4f1e81bc1d9daeb41b40889d1e8d1cec27087f61b7b54f4e5d99c031acedc48f" gracePeriod=30
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.270750 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="ceilometer-notification-agent" containerID="cri-o://d135a11dabe2b21a5d5f111d71771b37cd712b5a4675684608703b85d4b84a05" gracePeriod=30
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.270990 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="proxy-httpd" containerID="cri-o://5457ea54ccc768adb098ee155755be6f0bc0286d124c50c835b3ce55b1720028" gracePeriod=30
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.509418 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.510822 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 12 00:48:38 crc kubenswrapper[4606]: I1212 00:48:38.699631 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37"
Dec 12 00:48:38 crc kubenswrapper[4606]: E1212 00:48:38.700018 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.210696 4606 generic.go:334] "Generic (PLEG): container finished" podID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerID="5457ea54ccc768adb098ee155755be6f0bc0286d124c50c835b3ce55b1720028" exitCode=0
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.211003 4606 generic.go:334] "Generic (PLEG): container finished" podID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerID="4f1e81bc1d9daeb41b40889d1e8d1cec27087f61b7b54f4e5d99c031acedc48f" exitCode=2
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.211015 4606 generic.go:334] "Generic (PLEG): container finished" podID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerID="e9385f2519eb011ca784d1ede40a391d82acd7e77913ce00e762d707764b1127" exitCode=0
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.210781 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerDied","Data":"5457ea54ccc768adb098ee155755be6f0bc0286d124c50c835b3ce55b1720028"}
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.211079 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerDied","Data":"4f1e81bc1d9daeb41b40889d1e8d1cec27087f61b7b54f4e5d99c031acedc48f"}
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.211093 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerDied","Data":"e9385f2519eb011ca784d1ede40a391d82acd7e77913ce00e762d707764b1127"}
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.213237 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14b69a07-590a-4574-b5b9-de1bfe8c8fcf","Type":"ContainerStarted","Data":"d7080ee7de2a48167ea4164302fffd2fdf269d521f587693ee0abae3805f12bf"}
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.214269 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.234096 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.833839371 podStartE2EDuration="2.234081273s" podCreationTimestamp="2025-12-12 00:48:37 +0000 UTC" firstStartedPulling="2025-12-12 00:48:38.068253531 +0000 UTC m=+1508.613606397" lastFinishedPulling="2025-12-12 00:48:38.468495423 +0000 UTC m=+1509.013848299" observedRunningTime="2025-12-12 00:48:39.234007291 +0000 UTC m=+1509.779360157" watchObservedRunningTime="2025-12-12 00:48:39.234081273 +0000 UTC m=+1509.779434129"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.528402 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.528438 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.711423 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6slrb"]
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.713271 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.726461 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6slrb"]
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.819057 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfxzs\" (UniqueName: \"kubernetes.io/projected/66096abb-d271-4f16-a936-ec59f78d40c0-kube-api-access-rfxzs\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.819108 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-catalog-content\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.819213 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-utilities\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.924588 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-utilities\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.924968 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfxzs\" (UniqueName: \"kubernetes.io/projected/66096abb-d271-4f16-a936-ec59f78d40c0-kube-api-access-rfxzs\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.925094 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-catalog-content\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.925662 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-catalog-content\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.925985 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-utilities\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:39 crc kubenswrapper[4606]: I1212 00:48:39.952110 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfxzs\" (UniqueName: \"kubernetes.io/projected/66096abb-d271-4f16-a936-ec59f78d40c0-kube-api-access-rfxzs\") pod \"redhat-marketplace-6slrb\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:40 crc kubenswrapper[4606]: I1212 00:48:40.071486 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6slrb"
Dec 12 00:48:40 crc kubenswrapper[4606]: I1212 00:48:40.225854 4606 generic.go:334] "Generic (PLEG): container finished" podID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerID="118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9" exitCode=0
Dec 12 00:48:40 crc kubenswrapper[4606]: I1212 00:48:40.225917 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerDied","Data":"118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9"}
Dec 12 00:48:40 crc kubenswrapper[4606]: I1212 00:48:40.225983 4606 scope.go:117] "RemoveContainer" containerID="a50c810b61ab80031ddc96b3bb79c28f1af6ee88b45271a71187d7164a11dd04"
Dec 12 00:48:40 crc kubenswrapper[4606]: I1212 00:48:40.753368 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6slrb"]
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.236400 4606 generic.go:334] "Generic (PLEG): container finished" podID="66096abb-d271-4f16-a936-ec59f78d40c0" containerID="d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681" exitCode=0
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.237662 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6slrb" event={"ID":"66096abb-d271-4f16-a936-ec59f78d40c0","Type":"ContainerDied","Data":"d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681"}
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.237738 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6slrb" event={"ID":"66096abb-d271-4f16-a936-ec59f78d40c0","Type":"ContainerStarted","Data":"c1ccf42191e51bc7d9e303c69542374720703be7de82104dc2eff60e273a5580"}
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.646542 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.646597 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.668380 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.710384 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 12 00:48:41 crc kubenswrapper[4606]: I1212 00:48:41.906130 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.299056 4606 generic.go:334] "Generic (PLEG): container finished" podID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerID="d135a11dabe2b21a5d5f111d71771b37cd712b5a4675684608703b85d4b84a05" exitCode=0
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.299479 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerDied","Data":"d135a11dabe2b21a5d5f111d71771b37cd712b5a4675684608703b85d4b84a05"}
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.309038 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6slrb" event={"ID":"66096abb-d271-4f16-a936-ec59f78d40c0","Type":"ContainerStarted","Data":"7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2"}
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.351591 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.468884 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584310 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-config-data\") pod \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") "
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584391 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-combined-ca-bundle\") pod \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") "
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584450 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-run-httpd\") pod \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") "
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584500 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-sg-core-conf-yaml\") pod \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") "
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584678 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-scripts\") pod \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") "
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584720 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nchnh\" (UniqueName: \"kubernetes.io/projected/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-kube-api-access-nchnh\") pod \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") "
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584769 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-log-httpd\") pod \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\" (UID: \"841a9b2c-ca62-41f0-8307-7cd58b43aa9e\") "
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.584848 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "841a9b2c-ca62-41f0-8307-7cd58b43aa9e" (UID: "841a9b2c-ca62-41f0-8307-7cd58b43aa9e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.585421 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "841a9b2c-ca62-41f0-8307-7cd58b43aa9e" (UID: "841a9b2c-ca62-41f0-8307-7cd58b43aa9e"). InnerVolumeSpecName "log-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.586015 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.586036 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.636374 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-scripts" (OuterVolumeSpecName: "scripts") pod "841a9b2c-ca62-41f0-8307-7cd58b43aa9e" (UID: "841a9b2c-ca62-41f0-8307-7cd58b43aa9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.639979 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-kube-api-access-nchnh" (OuterVolumeSpecName: "kube-api-access-nchnh") pod "841a9b2c-ca62-41f0-8307-7cd58b43aa9e" (UID: "841a9b2c-ca62-41f0-8307-7cd58b43aa9e"). InnerVolumeSpecName "kube-api-access-nchnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.667479 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "841a9b2c-ca62-41f0-8307-7cd58b43aa9e" (UID: "841a9b2c-ca62-41f0-8307-7cd58b43aa9e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.690681 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.690720 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.690733 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nchnh\" (UniqueName: \"kubernetes.io/projected/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-kube-api-access-nchnh\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.730253 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "841a9b2c-ca62-41f0-8307-7cd58b43aa9e" (UID: "841a9b2c-ca62-41f0-8307-7cd58b43aa9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.731328 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.731575 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.777502 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-config-data" (OuterVolumeSpecName: "config-data") pod "841a9b2c-ca62-41f0-8307-7cd58b43aa9e" (UID: "841a9b2c-ca62-41f0-8307-7cd58b43aa9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.794501 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:42 crc kubenswrapper[4606]: I1212 00:48:42.794622 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841a9b2c-ca62-41f0-8307-7cd58b43aa9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.318711 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"841a9b2c-ca62-41f0-8307-7cd58b43aa9e","Type":"ContainerDied","Data":"314bab1308a545aa3977fd76e5f21d2a28edcb4db3d952b9bba2f5c9acaa2236"} Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.318767 4606 scope.go:117] "RemoveContainer" containerID="5457ea54ccc768adb098ee155755be6f0bc0286d124c50c835b3ce55b1720028" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.318908 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.322001 4606 generic.go:334] "Generic (PLEG): container finished" podID="66096abb-d271-4f16-a936-ec59f78d40c0" containerID="7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2" exitCode=0 Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.323940 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6slrb" event={"ID":"66096abb-d271-4f16-a936-ec59f78d40c0","Type":"ContainerDied","Data":"7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2"} Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.346198 4606 scope.go:117] "RemoveContainer" containerID="4f1e81bc1d9daeb41b40889d1e8d1cec27087f61b7b54f4e5d99c031acedc48f" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.370458 4606 scope.go:117] "RemoveContainer" containerID="d135a11dabe2b21a5d5f111d71771b37cd712b5a4675684608703b85d4b84a05" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.396905 4606 scope.go:117] "RemoveContainer" containerID="e9385f2519eb011ca784d1ede40a391d82acd7e77913ce00e762d707764b1127" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.404265 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.414165 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.424501 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:43 crc kubenswrapper[4606]: E1212 00:48:43.424885 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="ceilometer-notification-agent" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.424900 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" 
containerName="ceilometer-notification-agent" Dec 12 00:48:43 crc kubenswrapper[4606]: E1212 00:48:43.424920 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="sg-core" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.424926 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="sg-core" Dec 12 00:48:43 crc kubenswrapper[4606]: E1212 00:48:43.424940 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="ceilometer-central-agent" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.424946 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="ceilometer-central-agent" Dec 12 00:48:43 crc kubenswrapper[4606]: E1212 00:48:43.424980 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="proxy-httpd" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.424985 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="proxy-httpd" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.425138 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="proxy-httpd" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.425150 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="ceilometer-notification-agent" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.425162 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="sg-core" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.425198 4606 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" containerName="ceilometer-central-agent" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.431851 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.434629 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.434635 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.435013 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.441624 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508408 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87g94\" (UniqueName: \"kubernetes.io/projected/e5b439be-0062-43d1-931c-0c2cb2a94e7f-kube-api-access-87g94\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508448 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508479 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-log-httpd\") pod \"ceilometer-0\" (UID: 
\"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508502 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508534 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-scripts\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508552 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508575 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-run-httpd\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.508590 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-config-data\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.609725 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87g94\" (UniqueName: \"kubernetes.io/projected/e5b439be-0062-43d1-931c-0c2cb2a94e7f-kube-api-access-87g94\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610006 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610127 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-log-httpd\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610252 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610363 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-scripts\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610445 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610559 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-run-httpd\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.611016 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-config-data\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610970 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-run-httpd\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.610715 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-log-httpd\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.615700 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.618103 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-scripts\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.637082 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-config-data\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.637677 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.638737 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87g94\" (UniqueName: \"kubernetes.io/projected/e5b439be-0062-43d1-931c-0c2cb2a94e7f-kube-api-access-87g94\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.639697 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " pod="openstack/ceilometer-0" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.719344 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841a9b2c-ca62-41f0-8307-7cd58b43aa9e" path="/var/lib/kubelet/pods/841a9b2c-ca62-41f0-8307-7cd58b43aa9e/volumes" Dec 12 00:48:43 crc kubenswrapper[4606]: I1212 00:48:43.752994 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:44 crc kubenswrapper[4606]: I1212 00:48:44.352038 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:45 crc kubenswrapper[4606]: I1212 00:48:45.365258 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerStarted","Data":"86a4d1893867a6d26ce4dc36c713b48bd9b0c1ea6454a44918a9a72cb29a5282"} Dec 12 00:48:45 crc kubenswrapper[4606]: I1212 00:48:45.365933 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerStarted","Data":"1a3e2d58089576075d8b645f562537cf614f472fe49116b594a08adb455c2db2"} Dec 12 00:48:45 crc kubenswrapper[4606]: I1212 00:48:45.370380 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6slrb" event={"ID":"66096abb-d271-4f16-a936-ec59f78d40c0","Type":"ContainerStarted","Data":"f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e"} Dec 12 00:48:45 crc kubenswrapper[4606]: I1212 00:48:45.390702 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6slrb" podStartSLOduration=3.379854118 podStartE2EDuration="6.390672377s" podCreationTimestamp="2025-12-12 00:48:39 +0000 UTC" firstStartedPulling="2025-12-12 00:48:41.239916399 +0000 UTC m=+1511.785269265" lastFinishedPulling="2025-12-12 00:48:44.250734658 +0000 UTC m=+1514.796087524" observedRunningTime="2025-12-12 00:48:45.387898232 +0000 UTC m=+1515.933251168" watchObservedRunningTime="2025-12-12 00:48:45.390672377 +0000 UTC m=+1515.936025283" Dec 12 00:48:46 crc kubenswrapper[4606]: I1212 00:48:46.379650 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerStarted","Data":"54ddf391554cecd54f714f13fac2eb26dc6050ba5beb3441de205298d6d8f284"} Dec 12 00:48:47 crc kubenswrapper[4606]: I1212 00:48:47.389469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerStarted","Data":"23741727f4506a2f3bd377fd00f0540d2072a00cf6e8864f626bd9f977381f4c"} Dec 12 00:48:47 crc kubenswrapper[4606]: I1212 00:48:47.603478 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 12 00:48:48 crc kubenswrapper[4606]: I1212 00:48:48.401156 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerStarted","Data":"40f018ad1c87a265e10bb7fc511038d32e836b545fc9ae443dbd1830c25d4216"} Dec 12 00:48:48 crc kubenswrapper[4606]: I1212 00:48:48.401831 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:48:48 crc kubenswrapper[4606]: I1212 00:48:48.427317 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9541805939999999 podStartE2EDuration="5.427297927s" podCreationTimestamp="2025-12-12 00:48:43 +0000 UTC" firstStartedPulling="2025-12-12 00:48:44.363831308 +0000 UTC m=+1514.909184174" lastFinishedPulling="2025-12-12 00:48:47.836948601 +0000 UTC m=+1518.382301507" observedRunningTime="2025-12-12 00:48:48.420779702 +0000 UTC m=+1518.966132568" watchObservedRunningTime="2025-12-12 00:48:48.427297927 +0000 UTC m=+1518.972650793" Dec 12 00:48:48 crc kubenswrapper[4606]: I1212 00:48:48.516659 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 12 00:48:48 crc kubenswrapper[4606]: I1212 00:48:48.517885 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Dec 12 00:48:48 crc kubenswrapper[4606]: I1212 00:48:48.523675 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.411620 4606 generic.go:334] "Generic (PLEG): container finished" podID="1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" containerID="ca55a0c69b9e5148ed7e570d1e1cff32cf0739821995257ca5662d20ac134add" exitCode=137 Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.411689 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e","Type":"ContainerDied","Data":"ca55a0c69b9e5148ed7e570d1e1cff32cf0739821995257ca5662d20ac134add"} Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.413238 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e","Type":"ContainerDied","Data":"056bf1efcf69368f446da4035b3875754ceb7bd5466d6bd0609302a3f7a2bbd7"} Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.413257 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="056bf1efcf69368f446da4035b3875754ceb7bd5466d6bd0609302a3f7a2bbd7" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.423561 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.443714 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.531936 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-combined-ca-bundle\") pod \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.532070 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2qp7\" (UniqueName: \"kubernetes.io/projected/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-kube-api-access-x2qp7\") pod \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.532123 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-config-data\") pod \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\" (UID: \"1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e\") " Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.570918 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-kube-api-access-x2qp7" (OuterVolumeSpecName: "kube-api-access-x2qp7") pod "1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" (UID: "1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e"). InnerVolumeSpecName "kube-api-access-x2qp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.574445 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-config-data" (OuterVolumeSpecName: "config-data") pod "1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" (UID: "1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.663486 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2qp7\" (UniqueName: \"kubernetes.io/projected/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-kube-api-access-x2qp7\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.663523 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.669562 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" (UID: "1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:49 crc kubenswrapper[4606]: I1212 00:48:49.765485 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.072409 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6slrb" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.072718 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6slrb" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.123571 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6slrb" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.420405 4606 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.459218 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.474486 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.483899 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6slrb" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.506759 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:50 crc kubenswrapper[4606]: E1212 00:48:50.507210 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" containerName="nova-cell1-novncproxy-novncproxy" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.507226 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" containerName="nova-cell1-novncproxy-novncproxy" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.507425 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" containerName="nova-cell1-novncproxy-novncproxy" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.508051 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.512989 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.513241 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.513370 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.517696 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.575084 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6slrb"] Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.686606 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.686673 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.686764 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbcb\" (UniqueName: 
\"kubernetes.io/projected/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-kube-api-access-mxbcb\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.686788 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.686839 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.788450 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.788509 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.788603 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbcb\" (UniqueName: 
\"kubernetes.io/projected/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-kube-api-access-mxbcb\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.788631 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.788674 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.798716 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.799363 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.800121 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.800610 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.807684 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbcb\" (UniqueName: \"kubernetes.io/projected/4674f3cc-27d8-4e55-9fe1-f13378aefbc8-kube-api-access-mxbcb\") pod \"nova-cell1-novncproxy-0\" (UID: \"4674f3cc-27d8-4e55-9fe1-f13378aefbc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:50 crc kubenswrapper[4606]: I1212 00:48:50.824920 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.295138 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 12 00:48:51 crc kubenswrapper[4606]: W1212 00:48:51.303639 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4674f3cc_27d8_4e55_9fe1_f13378aefbc8.slice/crio-59142c615fbdf8d28445d61891ef90a6aeff1546bfcd8f5899483bef7a803d2a WatchSource:0}: Error finding container 59142c615fbdf8d28445d61891ef90a6aeff1546bfcd8f5899483bef7a803d2a: Status 404 returned error can't find the container with id 59142c615fbdf8d28445d61891ef90a6aeff1546bfcd8f5899483bef7a803d2a Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.443441 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"4674f3cc-27d8-4e55-9fe1-f13378aefbc8","Type":"ContainerStarted","Data":"59142c615fbdf8d28445d61891ef90a6aeff1546bfcd8f5899483bef7a803d2a"} Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.649556 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.650313 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.655671 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.655981 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.709140 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e" path="/var/lib/kubelet/pods/1e589cf8-9e1a-4124-b7bb-c8fa6fd5b26e/volumes" Dec 12 00:48:51 crc kubenswrapper[4606]: I1212 00:48:51.905631 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.463814 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4674f3cc-27d8-4e55-9fe1-f13378aefbc8","Type":"ContainerStarted","Data":"f91e818359edac930fc279c0bf39f9a5651a9080062d774b972c4a88311d7adb"} Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.464187 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6slrb" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" 
containerName="registry-server" containerID="cri-o://f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e" gracePeriod=2 Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.464751 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.481862 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.497813 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.497789404 podStartE2EDuration="2.497789404s" podCreationTimestamp="2025-12-12 00:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:52.488856025 +0000 UTC m=+1523.034208891" watchObservedRunningTime="2025-12-12 00:48:52.497789404 +0000 UTC m=+1523.043142280" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.700411 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:48:52 crc kubenswrapper[4606]: E1212 00:48:52.700844 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.738245 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-78bt2"] Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.740144 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.763824 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-78bt2"] Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.833434 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-frh6k"] Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.835505 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.857244 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-config\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.857548 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.857674 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.857775 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.857820 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwmcv\" (UniqueName: \"kubernetes.io/projected/a8632450-2d2a-4683-a1f8-fa91a510e5bd-kube-api-access-lwmcv\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.857895 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.870813 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frh6k"] Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.959766 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-utilities\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.959838 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 
12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.959859 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-catalog-content\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.959896 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.959948 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.959971 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwmcv\" (UniqueName: \"kubernetes.io/projected/a8632450-2d2a-4683-a1f8-fa91a510e5bd-kube-api-access-lwmcv\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.959987 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62wc8\" (UniqueName: \"kubernetes.io/projected/459d1fea-af9d-46b7-ad3a-057dc9b980f2-kube-api-access-62wc8\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 
00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.960021 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.960055 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-config\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.960986 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-config\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.961216 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.961454 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.961853 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.962056 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:52 crc kubenswrapper[4606]: I1212 00:48:52.981215 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwmcv\" (UniqueName: \"kubernetes.io/projected/a8632450-2d2a-4683-a1f8-fa91a510e5bd-kube-api-access-lwmcv\") pod \"dnsmasq-dns-cd5cbd7b9-78bt2\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.062813 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-utilities\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.063149 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-catalog-content\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.063232 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62wc8\" (UniqueName: 
\"kubernetes.io/projected/459d1fea-af9d-46b7-ad3a-057dc9b980f2-kube-api-access-62wc8\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.063671 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-utilities\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.063921 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-catalog-content\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.069556 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.102291 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62wc8\" (UniqueName: \"kubernetes.io/projected/459d1fea-af9d-46b7-ad3a-057dc9b980f2-kube-api-access-62wc8\") pod \"redhat-operators-frh6k\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.165080 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.271518 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6slrb" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.386913 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-utilities\") pod \"66096abb-d271-4f16-a936-ec59f78d40c0\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.387139 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-catalog-content\") pod \"66096abb-d271-4f16-a936-ec59f78d40c0\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.387231 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfxzs\" (UniqueName: \"kubernetes.io/projected/66096abb-d271-4f16-a936-ec59f78d40c0-kube-api-access-rfxzs\") pod \"66096abb-d271-4f16-a936-ec59f78d40c0\" (UID: \"66096abb-d271-4f16-a936-ec59f78d40c0\") " Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.389096 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-utilities" (OuterVolumeSpecName: "utilities") pod "66096abb-d271-4f16-a936-ec59f78d40c0" (UID: "66096abb-d271-4f16-a936-ec59f78d40c0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.389489 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.395610 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66096abb-d271-4f16-a936-ec59f78d40c0-kube-api-access-rfxzs" (OuterVolumeSpecName: "kube-api-access-rfxzs") pod "66096abb-d271-4f16-a936-ec59f78d40c0" (UID: "66096abb-d271-4f16-a936-ec59f78d40c0"). InnerVolumeSpecName "kube-api-access-rfxzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.450581 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66096abb-d271-4f16-a936-ec59f78d40c0" (UID: "66096abb-d271-4f16-a936-ec59f78d40c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.481919 4606 generic.go:334] "Generic (PLEG): container finished" podID="66096abb-d271-4f16-a936-ec59f78d40c0" containerID="f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e" exitCode=0 Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.482831 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6slrb" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.483335 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6slrb" event={"ID":"66096abb-d271-4f16-a936-ec59f78d40c0","Type":"ContainerDied","Data":"f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e"} Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.483389 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6slrb" event={"ID":"66096abb-d271-4f16-a936-ec59f78d40c0","Type":"ContainerDied","Data":"c1ccf42191e51bc7d9e303c69542374720703be7de82104dc2eff60e273a5580"} Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.483411 4606 scope.go:117] "RemoveContainer" containerID="f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.491091 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66096abb-d271-4f16-a936-ec59f78d40c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.491117 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfxzs\" (UniqueName: \"kubernetes.io/projected/66096abb-d271-4f16-a936-ec59f78d40c0-kube-api-access-rfxzs\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.542371 4606 scope.go:117] "RemoveContainer" containerID="7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.551776 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6slrb"] Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.572476 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6slrb"] Dec 12 00:48:53 crc 
kubenswrapper[4606]: I1212 00:48:53.576599 4606 scope.go:117] "RemoveContainer" containerID="d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.627723 4606 scope.go:117] "RemoveContainer" containerID="f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e" Dec 12 00:48:53 crc kubenswrapper[4606]: E1212 00:48:53.642454 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e\": container with ID starting with f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e not found: ID does not exist" containerID="f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.642511 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e"} err="failed to get container status \"f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e\": rpc error: code = NotFound desc = could not find container \"f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e\": container with ID starting with f71d952b38db1acf13c62ba6c9b0030399863495353fc7181de4f5613807c62e not found: ID does not exist" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.642541 4606 scope.go:117] "RemoveContainer" containerID="7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2" Dec 12 00:48:53 crc kubenswrapper[4606]: E1212 00:48:53.647469 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2\": container with ID starting with 7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2 not found: ID does not exist" 
containerID="7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.647503 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2"} err="failed to get container status \"7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2\": rpc error: code = NotFound desc = could not find container \"7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2\": container with ID starting with 7ab2cd1128dbcfc400edfc345c36516799f13b24abb9a2ccac2432a074126cc2 not found: ID does not exist" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.647526 4606 scope.go:117] "RemoveContainer" containerID="d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681" Dec 12 00:48:53 crc kubenswrapper[4606]: E1212 00:48:53.648514 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681\": container with ID starting with d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681 not found: ID does not exist" containerID="d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.648548 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681"} err="failed to get container status \"d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681\": rpc error: code = NotFound desc = could not find container \"d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681\": container with ID starting with d49f032c238570d73a2a818f5e972e9c8532b3ab5287860dcc77bce2d3e00681 not found: ID does not exist" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.734150 4606 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" path="/var/lib/kubelet/pods/66096abb-d271-4f16-a936-ec59f78d40c0/volumes" Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.764300 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-78bt2"] Dec 12 00:48:53 crc kubenswrapper[4606]: I1212 00:48:53.963118 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frh6k"] Dec 12 00:48:53 crc kubenswrapper[4606]: W1212 00:48:53.969592 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod459d1fea_af9d_46b7_ad3a_057dc9b980f2.slice/crio-dbe15dcf9524a28856eb82721eb2ec26aa6767253c8102118aea53b5c6094126 WatchSource:0}: Error finding container dbe15dcf9524a28856eb82721eb2ec26aa6767253c8102118aea53b5c6094126: Status 404 returned error can't find the container with id dbe15dcf9524a28856eb82721eb2ec26aa6767253c8102118aea53b5c6094126 Dec 12 00:48:54 crc kubenswrapper[4606]: I1212 00:48:54.493286 4606 generic.go:334] "Generic (PLEG): container finished" podID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerID="d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86" exitCode=0 Dec 12 00:48:54 crc kubenswrapper[4606]: I1212 00:48:54.493352 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" event={"ID":"a8632450-2d2a-4683-a1f8-fa91a510e5bd","Type":"ContainerDied","Data":"d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86"} Dec 12 00:48:54 crc kubenswrapper[4606]: I1212 00:48:54.493373 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" event={"ID":"a8632450-2d2a-4683-a1f8-fa91a510e5bd","Type":"ContainerStarted","Data":"3c741bcede5b7732ebe5d748c41109cbef64f871b4e47df3971002b64dc5ddec"} Dec 12 00:48:54 crc kubenswrapper[4606]: I1212 00:48:54.495650 
4606 generic.go:334] "Generic (PLEG): container finished" podID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerID="1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f" exitCode=0 Dec 12 00:48:54 crc kubenswrapper[4606]: I1212 00:48:54.495729 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frh6k" event={"ID":"459d1fea-af9d-46b7-ad3a-057dc9b980f2","Type":"ContainerDied","Data":"1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f"} Dec 12 00:48:54 crc kubenswrapper[4606]: I1212 00:48:54.495777 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frh6k" event={"ID":"459d1fea-af9d-46b7-ad3a-057dc9b980f2","Type":"ContainerStarted","Data":"dbe15dcf9524a28856eb82721eb2ec26aa6767253c8102118aea53b5c6094126"} Dec 12 00:48:55 crc kubenswrapper[4606]: I1212 00:48:55.538802 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" event={"ID":"a8632450-2d2a-4683-a1f8-fa91a510e5bd","Type":"ContainerStarted","Data":"970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259"} Dec 12 00:48:55 crc kubenswrapper[4606]: I1212 00:48:55.539939 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:48:55 crc kubenswrapper[4606]: I1212 00:48:55.558228 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" podStartSLOduration=3.558212432 podStartE2EDuration="3.558212432s" podCreationTimestamp="2025-12-12 00:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:48:55.557818591 +0000 UTC m=+1526.103171457" watchObservedRunningTime="2025-12-12 00:48:55.558212432 +0000 UTC m=+1526.103565298" Dec 12 00:48:55 crc kubenswrapper[4606]: I1212 00:48:55.826193 4606 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:48:55 crc kubenswrapper[4606]: I1212 00:48:55.995735 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:48:55 crc kubenswrapper[4606]: I1212 00:48:55.995937 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-log" containerID="cri-o://a0008e6405cd0549dca2ad6c59ed2ff839f325226c110499b357f4629df23e89" gracePeriod=30 Dec 12 00:48:55 crc kubenswrapper[4606]: I1212 00:48:55.996017 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-api" containerID="cri-o://ca87d7ac40c931cabbbafb535d048eb8408a18d2b49b368ae79cdd5222d438bb" gracePeriod=30 Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.550312 4606 generic.go:334] "Generic (PLEG): container finished" podID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerID="a0008e6405cd0549dca2ad6c59ed2ff839f325226c110499b357f4629df23e89" exitCode=143 Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.550391 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71","Type":"ContainerDied","Data":"a0008e6405cd0549dca2ad6c59ed2ff839f325226c110499b357f4629df23e89"} Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.555292 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frh6k" event={"ID":"459d1fea-af9d-46b7-ad3a-057dc9b980f2","Type":"ContainerStarted","Data":"1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53"} Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.742434 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.742954 4606 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-central-agent" containerID="cri-o://86a4d1893867a6d26ce4dc36c713b48bd9b0c1ea6454a44918a9a72cb29a5282" gracePeriod=30 Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.743081 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-notification-agent" containerID="cri-o://54ddf391554cecd54f714f13fac2eb26dc6050ba5beb3441de205298d6d8f284" gracePeriod=30 Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.743129 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="sg-core" containerID="cri-o://23741727f4506a2f3bd377fd00f0540d2072a00cf6e8864f626bd9f977381f4c" gracePeriod=30 Dec 12 00:48:56 crc kubenswrapper[4606]: I1212 00:48:56.743186 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="proxy-httpd" containerID="cri-o://40f018ad1c87a265e10bb7fc511038d32e836b545fc9ae443dbd1830c25d4216" gracePeriod=30 Dec 12 00:48:57 crc kubenswrapper[4606]: I1212 00:48:57.580194 4606 generic.go:334] "Generic (PLEG): container finished" podID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerID="40f018ad1c87a265e10bb7fc511038d32e836b545fc9ae443dbd1830c25d4216" exitCode=0 Dec 12 00:48:57 crc kubenswrapper[4606]: I1212 00:48:57.580226 4606 generic.go:334] "Generic (PLEG): container finished" podID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerID="23741727f4506a2f3bd377fd00f0540d2072a00cf6e8864f626bd9f977381f4c" exitCode=2 Dec 12 00:48:57 crc kubenswrapper[4606]: I1212 00:48:57.580235 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerID="86a4d1893867a6d26ce4dc36c713b48bd9b0c1ea6454a44918a9a72cb29a5282" exitCode=0 Dec 12 00:48:57 crc kubenswrapper[4606]: I1212 00:48:57.581051 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerDied","Data":"40f018ad1c87a265e10bb7fc511038d32e836b545fc9ae443dbd1830c25d4216"} Dec 12 00:48:57 crc kubenswrapper[4606]: I1212 00:48:57.581085 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerDied","Data":"23741727f4506a2f3bd377fd00f0540d2072a00cf6e8864f626bd9f977381f4c"} Dec 12 00:48:57 crc kubenswrapper[4606]: I1212 00:48:57.581095 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerDied","Data":"86a4d1893867a6d26ce4dc36c713b48bd9b0c1ea6454a44918a9a72cb29a5282"} Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.591600 4606 generic.go:334] "Generic (PLEG): container finished" podID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerID="54ddf391554cecd54f714f13fac2eb26dc6050ba5beb3441de205298d6d8f284" exitCode=0 Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.591665 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerDied","Data":"54ddf391554cecd54f714f13fac2eb26dc6050ba5beb3441de205298d6d8f284"} Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.591936 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5b439be-0062-43d1-931c-0c2cb2a94e7f","Type":"ContainerDied","Data":"1a3e2d58089576075d8b645f562537cf614f472fe49116b594a08adb455c2db2"} Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.591954 4606 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="1a3e2d58089576075d8b645f562537cf614f472fe49116b594a08adb455c2db2" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.613409 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.691767 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-sg-core-conf-yaml\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.691809 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-scripts\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.691864 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87g94\" (UniqueName: \"kubernetes.io/projected/e5b439be-0062-43d1-931c-0c2cb2a94e7f-kube-api-access-87g94\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.691886 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-ceilometer-tls-certs\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.691960 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-config-data\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: 
\"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.692031 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-log-httpd\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.692082 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-combined-ca-bundle\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.692104 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-run-httpd\") pod \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\" (UID: \"e5b439be-0062-43d1-931c-0c2cb2a94e7f\") " Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.692759 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.694103 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.718360 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b439be-0062-43d1-931c-0c2cb2a94e7f-kube-api-access-87g94" (OuterVolumeSpecName: "kube-api-access-87g94") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). InnerVolumeSpecName "kube-api-access-87g94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.720425 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-scripts" (OuterVolumeSpecName: "scripts") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.721417 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.753871 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.798583 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.798632 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.798649 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.798661 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87g94\" (UniqueName: \"kubernetes.io/projected/e5b439be-0062-43d1-931c-0c2cb2a94e7f-kube-api-access-87g94\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.798677 4606 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.798688 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5b439be-0062-43d1-931c-0c2cb2a94e7f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.839446 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-config-data" (OuterVolumeSpecName: "config-data") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.844241 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5b439be-0062-43d1-931c-0c2cb2a94e7f" (UID: "e5b439be-0062-43d1-931c-0c2cb2a94e7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.901042 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:58 crc kubenswrapper[4606]: I1212 00:48:58.901079 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b439be-0062-43d1-931c-0c2cb2a94e7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.605069 4606 generic.go:334] "Generic (PLEG): container finished" podID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerID="ca87d7ac40c931cabbbafb535d048eb8408a18d2b49b368ae79cdd5222d438bb" exitCode=0 Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.605481 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71","Type":"ContainerDied","Data":"ca87d7ac40c931cabbbafb535d048eb8408a18d2b49b368ae79cdd5222d438bb"} Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.610947 4606 generic.go:334] "Generic (PLEG): container finished" podID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerID="1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53" exitCode=0 Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.611034 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.611502 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frh6k" event={"ID":"459d1fea-af9d-46b7-ad3a-057dc9b980f2","Type":"ContainerDied","Data":"1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53"} Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.663512 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.682243 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701109 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:59 crc kubenswrapper[4606]: E1212 00:48:59.701507 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" containerName="extract-utilities" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701525 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" containerName="extract-utilities" Dec 12 00:48:59 crc kubenswrapper[4606]: E1212 00:48:59.701537 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" containerName="registry-server" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701544 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" containerName="registry-server" Dec 12 00:48:59 crc kubenswrapper[4606]: E1212 00:48:59.701571 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" containerName="extract-content" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701577 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" 
containerName="extract-content" Dec 12 00:48:59 crc kubenswrapper[4606]: E1212 00:48:59.701589 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="proxy-httpd" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701594 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="proxy-httpd" Dec 12 00:48:59 crc kubenswrapper[4606]: E1212 00:48:59.701617 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="sg-core" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701622 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="sg-core" Dec 12 00:48:59 crc kubenswrapper[4606]: E1212 00:48:59.701635 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-central-agent" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701642 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-central-agent" Dec 12 00:48:59 crc kubenswrapper[4606]: E1212 00:48:59.701652 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-notification-agent" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701659 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-notification-agent" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701823 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="sg-core" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701842 4606 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-notification-agent" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701854 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="ceilometer-central-agent" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701868 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="66096abb-d271-4f16-a936-ec59f78d40c0" containerName="registry-server" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.701877 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" containerName="proxy-httpd" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.707380 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.715612 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.716708 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.716823 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.732406 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b439be-0062-43d1-931c-0c2cb2a94e7f" path="/var/lib/kubelet/pods/e5b439be-0062-43d1-931c-0c2cb2a94e7f/volumes" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.733107 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.770936 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.817827 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr492\" (UniqueName: \"kubernetes.io/projected/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-kube-api-access-sr492\") pod \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.817871 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-combined-ca-bundle\") pod \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818050 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-config-data\") pod \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818075 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-logs\") pod \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\" (UID: \"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71\") " Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818365 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-config-data\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818437 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818513 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818581 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-run-httpd\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818650 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-scripts\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818686 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-log-httpd\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.818753 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.819301 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l82rk\" (UniqueName: \"kubernetes.io/projected/41c91ed6-c9ea-48d5-ac06-87b859771b39-kube-api-access-l82rk\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.819207 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-logs" (OuterVolumeSpecName: "logs") pod "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" (UID: "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.851859 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-kube-api-access-sr492" (OuterVolumeSpecName: "kube-api-access-sr492") pod "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" (UID: "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71"). InnerVolumeSpecName "kube-api-access-sr492". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.865552 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-config-data" (OuterVolumeSpecName: "config-data") pod "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" (UID: "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.891092 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" (UID: "29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.920842 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.920915 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.920942 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-run-httpd\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.920986 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-scripts\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921010 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-log-httpd\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921044 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921065 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l82rk\" (UniqueName: \"kubernetes.io/projected/41c91ed6-c9ea-48d5-ac06-87b859771b39-kube-api-access-l82rk\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921091 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-config-data\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921141 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921152 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921162 4606 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.921187 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr492\" (UniqueName: \"kubernetes.io/projected/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71-kube-api-access-sr492\") on node \"crc\" DevicePath \"\"" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.922824 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-run-httpd\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.923122 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-log-httpd\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.924559 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-scripts\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.924796 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-config-data\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.926419 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.928642 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.935380 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:48:59 crc kubenswrapper[4606]: I1212 00:48:59.939954 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l82rk\" (UniqueName: \"kubernetes.io/projected/41c91ed6-c9ea-48d5-ac06-87b859771b39-kube-api-access-l82rk\") pod \"ceilometer-0\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " pod="openstack/ceilometer-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.087278 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.588654 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.623021 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerStarted","Data":"15435ed10edf41b633e220693af64b9a9a10680630e9f562e5c2232263b211c6"} Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.626503 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.626504 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71","Type":"ContainerDied","Data":"2f58c5386ff01ffd000085a77b3e76943ab3749aab770c67f8b13b5cdefe29a9"} Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.626583 4606 scope.go:117] "RemoveContainer" containerID="ca87d7ac40c931cabbbafb535d048eb8408a18d2b49b368ae79cdd5222d438bb" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.640690 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frh6k" event={"ID":"459d1fea-af9d-46b7-ad3a-057dc9b980f2","Type":"ContainerStarted","Data":"68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759"} Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.675574 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-frh6k" podStartSLOduration=3.04268398 podStartE2EDuration="8.675547954s" podCreationTimestamp="2025-12-12 00:48:52 +0000 UTC" firstStartedPulling="2025-12-12 00:48:54.500440724 +0000 UTC m=+1525.045793590" lastFinishedPulling="2025-12-12 00:49:00.133304698 +0000 UTC m=+1530.678657564" observedRunningTime="2025-12-12 00:49:00.657893371 +0000 UTC m=+1531.203246237" watchObservedRunningTime="2025-12-12 00:49:00.675547954 +0000 UTC m=+1531.220900820" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.682209 4606 scope.go:117] "RemoveContainer" containerID="a0008e6405cd0549dca2ad6c59ed2ff839f325226c110499b357f4629df23e89" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.695233 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.714789 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 
00:49:00.726543 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:00 crc kubenswrapper[4606]: E1212 00:49:00.727012 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-api" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.727029 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-api" Dec 12 00:49:00 crc kubenswrapper[4606]: E1212 00:49:00.727051 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-log" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.727057 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-log" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.727259 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-log" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.727282 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" containerName="nova-api-api" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.728283 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.736696 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.751029 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.751208 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.753023 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.826368 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.839095 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.839135 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.839192 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-config-data\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 
00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.839373 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-public-tls-certs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.839436 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788076ff-f08d-494e-b7ce-23897879660c-logs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.839470 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9r54\" (UniqueName: \"kubernetes.io/projected/788076ff-f08d-494e-b7ce-23897879660c-kube-api-access-t9r54\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.847479 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.941472 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-public-tls-certs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.941538 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788076ff-f08d-494e-b7ce-23897879660c-logs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc 
kubenswrapper[4606]: I1212 00:49:00.941567 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9r54\" (UniqueName: \"kubernetes.io/projected/788076ff-f08d-494e-b7ce-23897879660c-kube-api-access-t9r54\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.941694 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.941717 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.941763 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-config-data\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.942783 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788076ff-f08d-494e-b7ce-23897879660c-logs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.950183 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.950425 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.950804 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-public-tls-certs\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.956883 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9r54\" (UniqueName: \"kubernetes.io/projected/788076ff-f08d-494e-b7ce-23897879660c-kube-api-access-t9r54\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:00 crc kubenswrapper[4606]: I1212 00:49:00.958728 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-config-data\") pod \"nova-api-0\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " pod="openstack/nova-api-0" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.072801 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.593806 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.655940 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerStarted","Data":"343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53"} Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.665399 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"788076ff-f08d-494e-b7ce-23897879660c","Type":"ContainerStarted","Data":"6c2d1ae960b6beb782608cd72240004bd98e8b667cbd85dccd11da921ffc5592"} Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.687433 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.724872 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71" path="/var/lib/kubelet/pods/29f7a7c3-4ec4-4e0e-8309-6a9dddcd9b71/volumes" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.846973 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.906441 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79c99578bb-cdgsn" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.906545 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.982216 
4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pz8fh"] Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.983452 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.985684 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.985871 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 12 00:49:01 crc kubenswrapper[4606]: I1212 00:49:01.988617 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pz8fh"] Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.072747 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-config-data\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.072794 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.073334 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtt8l\" (UniqueName: \"kubernetes.io/projected/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-kube-api-access-vtt8l\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " 
pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.073492 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-scripts\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.175463 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtt8l\" (UniqueName: \"kubernetes.io/projected/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-kube-api-access-vtt8l\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.175947 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-scripts\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.176070 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-config-data\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.176105 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 
12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.192042 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.197013 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-config-data\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.198310 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-scripts\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.200716 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtt8l\" (UniqueName: \"kubernetes.io/projected/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-kube-api-access-vtt8l\") pod \"nova-cell1-cell-mapping-pz8fh\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.344524 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.682831 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"788076ff-f08d-494e-b7ce-23897879660c","Type":"ContainerStarted","Data":"9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321"} Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.683092 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"788076ff-f08d-494e-b7ce-23897879660c","Type":"ContainerStarted","Data":"27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353"} Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.688721 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerStarted","Data":"b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce"} Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.732808 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.732788607 podStartE2EDuration="2.732788607s" podCreationTimestamp="2025-12-12 00:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:49:02.712563995 +0000 UTC m=+1533.257916861" watchObservedRunningTime="2025-12-12 00:49:02.732788607 +0000 UTC m=+1533.278141473" Dec 12 00:49:02 crc kubenswrapper[4606]: I1212 00:49:02.903589 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pz8fh"] Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.072415 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.164276 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-bccf8f775-4t5v7"] Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.164502 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" podUID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerName="dnsmasq-dns" containerID="cri-o://373b04978f7f7adca9a70fa32eb7c948937a6429caa347afdeb01a6f17840877" gracePeriod=10 Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.172949 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.172981 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.739555 4606 generic.go:334] "Generic (PLEG): container finished" podID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerID="373b04978f7f7adca9a70fa32eb7c948937a6429caa347afdeb01a6f17840877" exitCode=0 Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.739810 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" event={"ID":"a930b062-89d9-4d8d-b649-b926aa8b2fe9","Type":"ContainerDied","Data":"373b04978f7f7adca9a70fa32eb7c948937a6429caa347afdeb01a6f17840877"} Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.748301 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerStarted","Data":"4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c"} Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.779248 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pz8fh" event={"ID":"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0","Type":"ContainerStarted","Data":"9d565397aa178c1bed8048e5ced41a078fbb2034318963028a3445099f8688f5"} Dec 12 00:49:03 crc 
kubenswrapper[4606]: I1212 00:49:03.779306 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pz8fh" event={"ID":"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0","Type":"ContainerStarted","Data":"5b6dffa55c5baa1c049c770ea2f8c16dbf6ed9477f0f38ec1e62a53d68c75660"} Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.796892 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pz8fh" podStartSLOduration=2.796874254 podStartE2EDuration="2.796874254s" podCreationTimestamp="2025-12-12 00:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:49:03.794679375 +0000 UTC m=+1534.340032241" watchObservedRunningTime="2025-12-12 00:49:03.796874254 +0000 UTC m=+1534.342227110" Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.873670 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.931561 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-nb\") pod \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.931739 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gs5x\" (UniqueName: \"kubernetes.io/projected/a930b062-89d9-4d8d-b649-b926aa8b2fe9-kube-api-access-7gs5x\") pod \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.931769 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-config\") pod \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.931804 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-svc\") pod \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.931852 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-sb\") pod \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " Dec 12 00:49:03 crc kubenswrapper[4606]: I1212 00:49:03.931906 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-swift-storage-0\") pod \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\" (UID: \"a930b062-89d9-4d8d-b649-b926aa8b2fe9\") " Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.013458 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a930b062-89d9-4d8d-b649-b926aa8b2fe9-kube-api-access-7gs5x" (OuterVolumeSpecName: "kube-api-access-7gs5x") pod "a930b062-89d9-4d8d-b649-b926aa8b2fe9" (UID: "a930b062-89d9-4d8d-b649-b926aa8b2fe9"). InnerVolumeSpecName "kube-api-access-7gs5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.048581 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gs5x\" (UniqueName: \"kubernetes.io/projected/a930b062-89d9-4d8d-b649-b926aa8b2fe9-kube-api-access-7gs5x\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.060484 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a930b062-89d9-4d8d-b649-b926aa8b2fe9" (UID: "a930b062-89d9-4d8d-b649-b926aa8b2fe9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.067822 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a930b062-89d9-4d8d-b649-b926aa8b2fe9" (UID: "a930b062-89d9-4d8d-b649-b926aa8b2fe9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.112925 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-config" (OuterVolumeSpecName: "config") pod "a930b062-89d9-4d8d-b649-b926aa8b2fe9" (UID: "a930b062-89d9-4d8d-b649-b926aa8b2fe9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.119932 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a930b062-89d9-4d8d-b649-b926aa8b2fe9" (UID: "a930b062-89d9-4d8d-b649-b926aa8b2fe9"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.125783 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a930b062-89d9-4d8d-b649-b926aa8b2fe9" (UID: "a930b062-89d9-4d8d-b649-b926aa8b2fe9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.149965 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.149997 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.150007 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.150015 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.150023 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a930b062-89d9-4d8d-b649-b926aa8b2fe9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.278907 4606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-frh6k" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="registry-server" probeResult="failure" output=< Dec 12 00:49:04 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:49:04 crc kubenswrapper[4606]: > Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.790657 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.790835 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-4t5v7" event={"ID":"a930b062-89d9-4d8d-b649-b926aa8b2fe9","Type":"ContainerDied","Data":"d8d7a1efbaab6d22e6d30aee564eb6894063dd038290ebec51ee6c47076801f1"} Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.791805 4606 scope.go:117] "RemoveContainer" containerID="373b04978f7f7adca9a70fa32eb7c948937a6429caa347afdeb01a6f17840877" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.823631 4606 scope.go:117] "RemoveContainer" containerID="1fe3f35acb6880fa87ad534db567d0670de94b1081cc7dbaaf1a7e703fe31835" Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.847410 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-4t5v7"] Dec 12 00:49:04 crc kubenswrapper[4606]: I1212 00:49:04.861279 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-4t5v7"] Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.710045 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" path="/var/lib/kubelet/pods/a930b062-89d9-4d8d-b649-b926aa8b2fe9/volumes" Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.801953 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerStarted","Data":"a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288"} Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.803071 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-central-agent" containerID="cri-o://343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53" gracePeriod=30 Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.803210 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="proxy-httpd" containerID="cri-o://a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288" gracePeriod=30 Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.803211 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="sg-core" containerID="cri-o://4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c" gracePeriod=30 Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.803180 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-notification-agent" containerID="cri-o://b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce" gracePeriod=30 Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.803115 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:49:05 crc kubenswrapper[4606]: I1212 00:49:05.829580 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.831341488 podStartE2EDuration="6.829564738s" podCreationTimestamp="2025-12-12 00:48:59 +0000 UTC" 
firstStartedPulling="2025-12-12 00:49:00.61788954 +0000 UTC m=+1531.163242406" lastFinishedPulling="2025-12-12 00:49:04.61611279 +0000 UTC m=+1535.161465656" observedRunningTime="2025-12-12 00:49:05.828917521 +0000 UTC m=+1536.374270397" watchObservedRunningTime="2025-12-12 00:49:05.829564738 +0000 UTC m=+1536.374917604" Dec 12 00:49:06 crc kubenswrapper[4606]: I1212 00:49:06.813926 4606 generic.go:334] "Generic (PLEG): container finished" podID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerID="a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288" exitCode=0 Dec 12 00:49:06 crc kubenswrapper[4606]: I1212 00:49:06.814241 4606 generic.go:334] "Generic (PLEG): container finished" podID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerID="4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c" exitCode=2 Dec 12 00:49:06 crc kubenswrapper[4606]: I1212 00:49:06.814252 4606 generic.go:334] "Generic (PLEG): container finished" podID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerID="b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce" exitCode=0 Dec 12 00:49:06 crc kubenswrapper[4606]: I1212 00:49:06.814131 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerDied","Data":"a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288"} Dec 12 00:49:06 crc kubenswrapper[4606]: I1212 00:49:06.814298 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerDied","Data":"4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c"} Dec 12 00:49:06 crc kubenswrapper[4606]: I1212 00:49:06.814312 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerDied","Data":"b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce"} Dec 12 00:49:07 crc 
kubenswrapper[4606]: I1212 00:49:07.222644 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.332703 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-tls-certs\") pod \"9ede4720-3fd7-4524-adfc-c1c395f12170\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.332787 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ede4720-3fd7-4524-adfc-c1c395f12170-logs\") pod \"9ede4720-3fd7-4524-adfc-c1c395f12170\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.332854 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-scripts\") pod \"9ede4720-3fd7-4524-adfc-c1c395f12170\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.332922 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-combined-ca-bundle\") pod \"9ede4720-3fd7-4524-adfc-c1c395f12170\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.332966 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkmtn\" (UniqueName: \"kubernetes.io/projected/9ede4720-3fd7-4524-adfc-c1c395f12170-kube-api-access-pkmtn\") pod \"9ede4720-3fd7-4524-adfc-c1c395f12170\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.333006 4606 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-config-data\") pod \"9ede4720-3fd7-4524-adfc-c1c395f12170\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.333064 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-secret-key\") pod \"9ede4720-3fd7-4524-adfc-c1c395f12170\" (UID: \"9ede4720-3fd7-4524-adfc-c1c395f12170\") " Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.333682 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ede4720-3fd7-4524-adfc-c1c395f12170-logs" (OuterVolumeSpecName: "logs") pod "9ede4720-3fd7-4524-adfc-c1c395f12170" (UID: "9ede4720-3fd7-4524-adfc-c1c395f12170"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.356711 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ede4720-3fd7-4524-adfc-c1c395f12170-kube-api-access-pkmtn" (OuterVolumeSpecName: "kube-api-access-pkmtn") pod "9ede4720-3fd7-4524-adfc-c1c395f12170" (UID: "9ede4720-3fd7-4524-adfc-c1c395f12170"). InnerVolumeSpecName "kube-api-access-pkmtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.366338 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9ede4720-3fd7-4524-adfc-c1c395f12170" (UID: "9ede4720-3fd7-4524-adfc-c1c395f12170"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.394616 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ede4720-3fd7-4524-adfc-c1c395f12170" (UID: "9ede4720-3fd7-4524-adfc-c1c395f12170"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.409449 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-scripts" (OuterVolumeSpecName: "scripts") pod "9ede4720-3fd7-4524-adfc-c1c395f12170" (UID: "9ede4720-3fd7-4524-adfc-c1c395f12170"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.438799 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ede4720-3fd7-4524-adfc-c1c395f12170-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.438824 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.438917 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.438939 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkmtn\" (UniqueName: \"kubernetes.io/projected/9ede4720-3fd7-4524-adfc-c1c395f12170-kube-api-access-pkmtn\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:07 crc 
kubenswrapper[4606]: I1212 00:49:07.438949 4606 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.502534 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-config-data" (OuterVolumeSpecName: "config-data") pod "9ede4720-3fd7-4524-adfc-c1c395f12170" (UID: "9ede4720-3fd7-4524-adfc-c1c395f12170"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.540789 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ede4720-3fd7-4524-adfc-c1c395f12170-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.543027 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "9ede4720-3fd7-4524-adfc-c1c395f12170" (UID: "9ede4720-3fd7-4524-adfc-c1c395f12170"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.652463 4606 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ede4720-3fd7-4524-adfc-c1c395f12170-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.704339 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:49:07 crc kubenswrapper[4606]: E1212 00:49:07.704591 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.823287 4606 generic.go:334] "Generic (PLEG): container finished" podID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerID="3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79" exitCode=137 Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.823354 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79c99578bb-cdgsn" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.823382 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerDied","Data":"3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79"} Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.823708 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79c99578bb-cdgsn" event={"ID":"9ede4720-3fd7-4524-adfc-c1c395f12170","Type":"ContainerDied","Data":"24356c8576770364961e0198ef8f3ba94c43809cc5158bcadf3aad4ce0c21a66"} Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.823732 4606 scope.go:117] "RemoveContainer" containerID="118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9" Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.880820 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79c99578bb-cdgsn"] Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.890819 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79c99578bb-cdgsn"] Dec 12 00:49:07 crc kubenswrapper[4606]: I1212 00:49:07.988975 4606 scope.go:117] "RemoveContainer" containerID="3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79" Dec 12 00:49:08 crc kubenswrapper[4606]: I1212 00:49:08.007959 4606 scope.go:117] "RemoveContainer" containerID="118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9" Dec 12 00:49:08 crc kubenswrapper[4606]: E1212 00:49:08.008447 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9\": container with ID starting with 118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9 not found: ID does not exist" 
containerID="118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9" Dec 12 00:49:08 crc kubenswrapper[4606]: I1212 00:49:08.008491 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9"} err="failed to get container status \"118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9\": rpc error: code = NotFound desc = could not find container \"118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9\": container with ID starting with 118461cd66eb187af751f8b1bf5a9a842145b3d677c56d382d11224f393d77e9 not found: ID does not exist" Dec 12 00:49:08 crc kubenswrapper[4606]: I1212 00:49:08.008517 4606 scope.go:117] "RemoveContainer" containerID="3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79" Dec 12 00:49:08 crc kubenswrapper[4606]: E1212 00:49:08.008867 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79\": container with ID starting with 3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79 not found: ID does not exist" containerID="3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79" Dec 12 00:49:08 crc kubenswrapper[4606]: I1212 00:49:08.008900 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79"} err="failed to get container status \"3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79\": rpc error: code = NotFound desc = could not find container \"3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79\": container with ID starting with 3da605673d28778834960969826f21261bb757319e5b0200c284265da3ac2e79 not found: ID does not exist" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.627057 4606 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685451 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-ceilometer-tls-certs\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685534 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-config-data\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685589 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-log-httpd\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685626 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-sg-core-conf-yaml\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685692 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l82rk\" (UniqueName: \"kubernetes.io/projected/41c91ed6-c9ea-48d5-ac06-87b859771b39-kube-api-access-l82rk\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685707 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-combined-ca-bundle\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685738 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-run-httpd\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.685773 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-scripts\") pod \"41c91ed6-c9ea-48d5-ac06-87b859771b39\" (UID: \"41c91ed6-c9ea-48d5-ac06-87b859771b39\") " Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.687536 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" (UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.687838 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" (UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.710096 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c91ed6-c9ea-48d5-ac06-87b859771b39-kube-api-access-l82rk" (OuterVolumeSpecName: "kube-api-access-l82rk") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" (UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). InnerVolumeSpecName "kube-api-access-l82rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.718018 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-scripts" (OuterVolumeSpecName: "scripts") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" (UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.719240 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" (UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.725630 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" path="/var/lib/kubelet/pods/9ede4720-3fd7-4524-adfc-c1c395f12170/volumes" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.760476 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" (UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.787483 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.787515 4606 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.787526 4606 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.787534 4606 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.787542 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l82rk\" (UniqueName: \"kubernetes.io/projected/41c91ed6-c9ea-48d5-ac06-87b859771b39-kube-api-access-l82rk\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.787550 4606 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c91ed6-c9ea-48d5-ac06-87b859771b39-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.788929 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" 
(UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.804531 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-config-data" (OuterVolumeSpecName: "config-data") pod "41c91ed6-c9ea-48d5-ac06-87b859771b39" (UID: "41c91ed6-c9ea-48d5-ac06-87b859771b39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.845103 4606 generic.go:334] "Generic (PLEG): container finished" podID="491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" containerID="9d565397aa178c1bed8048e5ced41a078fbb2034318963028a3445099f8688f5" exitCode=0 Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.845294 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pz8fh" event={"ID":"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0","Type":"ContainerDied","Data":"9d565397aa178c1bed8048e5ced41a078fbb2034318963028a3445099f8688f5"} Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.852374 4606 generic.go:334] "Generic (PLEG): container finished" podID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerID="343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53" exitCode=0 Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.852552 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerDied","Data":"343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53"} Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.852647 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c91ed6-c9ea-48d5-ac06-87b859771b39","Type":"ContainerDied","Data":"15435ed10edf41b633e220693af64b9a9a10680630e9f562e5c2232263b211c6"} Dec 12 
00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.852761 4606 scope.go:117] "RemoveContainer" containerID="a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.852984 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.885734 4606 scope.go:117] "RemoveContainer" containerID="4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.891634 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.891963 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c91ed6-c9ea-48d5-ac06-87b859771b39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.913064 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.917132 4606 scope.go:117] "RemoveContainer" containerID="b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.935100 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949375 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.949844 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949867 4606 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.949880 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerName="init" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949886 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerName="init" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.949899 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-central-agent" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949905 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-central-agent" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.949919 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="proxy-httpd" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949925 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="proxy-httpd" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.949939 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="sg-core" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949944 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="sg-core" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.949972 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-notification-agent" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949978 4606 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-notification-agent" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.949991 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon-log" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.949997 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon-log" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.950009 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerName="dnsmasq-dns" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950015 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerName="dnsmasq-dns" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950269 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950288 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon-log" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950302 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950312 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="a930b062-89d9-4d8d-b649-b926aa8b2fe9" containerName="dnsmasq-dns" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950320 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-central-agent" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950333 4606 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="sg-core" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950344 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="proxy-httpd" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950357 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" containerName="ceilometer-notification-agent" Dec 12 00:49:09 crc kubenswrapper[4606]: E1212 00:49:09.950542 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.950554 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ede4720-3fd7-4524-adfc-c1c395f12170" containerName="horizon" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.952360 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.957398 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.957673 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.961584 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 12 00:49:09 crc kubenswrapper[4606]: I1212 00:49:09.973808 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.001785 4606 scope.go:117] "RemoveContainer" containerID="343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.031258 4606 scope.go:117] "RemoveContainer" 
containerID="a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288" Dec 12 00:49:10 crc kubenswrapper[4606]: E1212 00:49:10.031822 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288\": container with ID starting with a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288 not found: ID does not exist" containerID="a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.031912 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288"} err="failed to get container status \"a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288\": rpc error: code = NotFound desc = could not find container \"a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288\": container with ID starting with a3cec7a4310523da610375d7fded7d0213efd919e1f32a536adb8212e4761288 not found: ID does not exist" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.031982 4606 scope.go:117] "RemoveContainer" containerID="4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c" Dec 12 00:49:10 crc kubenswrapper[4606]: E1212 00:49:10.032586 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c\": container with ID starting with 4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c not found: ID does not exist" containerID="4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.032629 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c"} err="failed to get container status \"4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c\": rpc error: code = NotFound desc = could not find container \"4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c\": container with ID starting with 4b9ee027cc0a76f91479a7dc3008eca16c36b270b205b708c61be61cad0b1a3c not found: ID does not exist" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.032661 4606 scope.go:117] "RemoveContainer" containerID="b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce" Dec 12 00:49:10 crc kubenswrapper[4606]: E1212 00:49:10.033161 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce\": container with ID starting with b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce not found: ID does not exist" containerID="b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.033266 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce"} err="failed to get container status \"b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce\": rpc error: code = NotFound desc = could not find container \"b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce\": container with ID starting with b92ce8c0a3d9f04aa9dd161d8bf68b5958baf28244a15197dbfa4a4e513b88ce not found: ID does not exist" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.033283 4606 scope.go:117] "RemoveContainer" containerID="343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53" Dec 12 00:49:10 crc kubenswrapper[4606]: E1212 00:49:10.033673 4606 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53\": container with ID starting with 343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53 not found: ID does not exist" containerID="343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.033698 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53"} err="failed to get container status \"343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53\": rpc error: code = NotFound desc = could not find container \"343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53\": container with ID starting with 343c29a9c75dca64857d284830fd650e12b327438b072f97cc3921043790df53 not found: ID does not exist" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.097611 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-scripts\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.097728 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbd84191-5cbb-48c5-af82-bfad9996ee60-log-httpd\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.097874 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-config-data\") pod \"ceilometer-0\" (UID: 
\"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.098012 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.098058 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.098118 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbd84191-5cbb-48c5-af82-bfad9996ee60-run-httpd\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.098145 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvbf7\" (UniqueName: \"kubernetes.io/projected/dbd84191-5cbb-48c5-af82-bfad9996ee60-kube-api-access-rvbf7\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.098326 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc 
kubenswrapper[4606]: I1212 00:49:10.200089 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbd84191-5cbb-48c5-af82-bfad9996ee60-log-httpd\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200142 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-config-data\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200202 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200225 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200262 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbd84191-5cbb-48c5-af82-bfad9996ee60-run-httpd\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200284 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbf7\" (UniqueName: 
\"kubernetes.io/projected/dbd84191-5cbb-48c5-af82-bfad9996ee60-kube-api-access-rvbf7\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200322 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200340 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-scripts\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200583 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbd84191-5cbb-48c5-af82-bfad9996ee60-log-httpd\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.200830 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbd84191-5cbb-48c5-af82-bfad9996ee60-run-httpd\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.203862 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-scripts\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.204658 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.205458 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.206110 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-config-data\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.208091 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbd84191-5cbb-48c5-af82-bfad9996ee60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.223383 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvbf7\" (UniqueName: \"kubernetes.io/projected/dbd84191-5cbb-48c5-af82-bfad9996ee60-kube-api-access-rvbf7\") pod \"ceilometer-0\" (UID: \"dbd84191-5cbb-48c5-af82-bfad9996ee60\") " pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.268508 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.776059 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 12 00:49:10 crc kubenswrapper[4606]: W1212 00:49:10.780190 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd84191_5cbb_48c5_af82_bfad9996ee60.slice/crio-0f9d19e5a09018f5d4655568177ef9e69e5f3533aa3468e32f0d46aab7236617 WatchSource:0}: Error finding container 0f9d19e5a09018f5d4655568177ef9e69e5f3533aa3468e32f0d46aab7236617: Status 404 returned error can't find the container with id 0f9d19e5a09018f5d4655568177ef9e69e5f3533aa3468e32f0d46aab7236617 Dec 12 00:49:10 crc kubenswrapper[4606]: I1212 00:49:10.877128 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbd84191-5cbb-48c5-af82-bfad9996ee60","Type":"ContainerStarted","Data":"0f9d19e5a09018f5d4655568177ef9e69e5f3533aa3468e32f0d46aab7236617"} Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.073674 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.073724 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.257682 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.327391 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-config-data\") pod \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.327766 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-scripts\") pod \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.327850 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtt8l\" (UniqueName: \"kubernetes.io/projected/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-kube-api-access-vtt8l\") pod \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.327933 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-combined-ca-bundle\") pod \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\" (UID: \"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0\") " Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.342370 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-kube-api-access-vtt8l" (OuterVolumeSpecName: "kube-api-access-vtt8l") pod "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" (UID: "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0"). InnerVolumeSpecName "kube-api-access-vtt8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.356094 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-scripts" (OuterVolumeSpecName: "scripts") pod "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" (UID: "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.361001 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-config-data" (OuterVolumeSpecName: "config-data") pod "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" (UID: "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.376087 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" (UID: "491bea2c-f0d9-45f2-bcf2-a49b4312e1f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.430440 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.430478 4606 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.430494 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtt8l\" (UniqueName: \"kubernetes.io/projected/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-kube-api-access-vtt8l\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.430510 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.711587 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c91ed6-c9ea-48d5-ac06-87b859771b39" path="/var/lib/kubelet/pods/41c91ed6-c9ea-48d5-ac06-87b859771b39/volumes" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.893821 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbd84191-5cbb-48c5-af82-bfad9996ee60","Type":"ContainerStarted","Data":"24cd107db188e3cae38c374e358a8627139bd6ae2225fcf897211e5a70a16158"} Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.895820 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pz8fh" event={"ID":"491bea2c-f0d9-45f2-bcf2-a49b4312e1f0","Type":"ContainerDied","Data":"5b6dffa55c5baa1c049c770ea2f8c16dbf6ed9477f0f38ec1e62a53d68c75660"} Dec 12 00:49:11 crc 
kubenswrapper[4606]: I1212 00:49:11.895847 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b6dffa55c5baa1c049c770ea2f8c16dbf6ed9477f0f38ec1e62a53d68c75660" Dec 12 00:49:11 crc kubenswrapper[4606]: I1212 00:49:11.895901 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pz8fh" Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.101226 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.101431 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-log" containerID="cri-o://27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353" gracePeriod=30 Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.101794 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-api" containerID="cri-o://9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321" gracePeriod=30 Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.103659 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.103738 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 
00:49:12.121557 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.121939 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-log" containerID="cri-o://bdf98d1f27cbe685dd1272f2502b449010d9c6a4277dc98d71146a6620cb0715" gracePeriod=30 Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.122838 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-metadata" containerID="cri-o://7e10fb794188aaccf0db771b264ebc20e470eaec80fc45b8db4ef8f5f1e2b19b" gracePeriod=30 Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.183700 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.183917 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="eac07a42-5397-42fd-be30-3677507d5a65" containerName="nova-scheduler-scheduler" containerID="cri-o://76493356d682c0e7accbc9e0c87215456eb472aa3fd0b81543206e3f0dc86c5d" gracePeriod=30 Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.910162 4606 generic.go:334] "Generic (PLEG): container finished" podID="788076ff-f08d-494e-b7ce-23897879660c" containerID="27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353" exitCode=143 Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.910261 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"788076ff-f08d-494e-b7ce-23897879660c","Type":"ContainerDied","Data":"27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353"} Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.914383 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerID="bdf98d1f27cbe685dd1272f2502b449010d9c6a4277dc98d71146a6620cb0715" exitCode=143 Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.914481 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176ccd-46b9-44e4-af1b-f02f91762469","Type":"ContainerDied","Data":"bdf98d1f27cbe685dd1272f2502b449010d9c6a4277dc98d71146a6620cb0715"} Dec 12 00:49:12 crc kubenswrapper[4606]: I1212 00:49:12.926033 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbd84191-5cbb-48c5-af82-bfad9996ee60","Type":"ContainerStarted","Data":"9ea0ecece773e8728cf0ee169ef41de124203f26b0693bacfffaf15da3852bc4"} Dec 12 00:49:13 crc kubenswrapper[4606]: I1212 00:49:13.937247 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbd84191-5cbb-48c5-af82-bfad9996ee60","Type":"ContainerStarted","Data":"99be8cf76d7c624460f5f9bc06510e2fbf12b190e30826e8b9dc1e1bf1a4f7da"} Dec 12 00:49:14 crc kubenswrapper[4606]: I1212 00:49:14.211844 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-frh6k" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="registry-server" probeResult="failure" output=< Dec 12 00:49:14 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:49:14 crc kubenswrapper[4606]: > Dec 12 00:49:14 crc kubenswrapper[4606]: I1212 00:49:14.969727 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbd84191-5cbb-48c5-af82-bfad9996ee60","Type":"ContainerStarted","Data":"e83a7e07ea40787b7acf506c48f4274cdf46a4bd7b3d4d9539e6f8cbb191efbd"} Dec 12 00:49:14 crc kubenswrapper[4606]: I1212 00:49:14.970524 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 12 00:49:14 crc kubenswrapper[4606]: I1212 00:49:14.994795 4606 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.553674945 podStartE2EDuration="5.9947804s" podCreationTimestamp="2025-12-12 00:49:09 +0000 UTC" firstStartedPulling="2025-12-12 00:49:10.783568994 +0000 UTC m=+1541.328921860" lastFinishedPulling="2025-12-12 00:49:14.224674449 +0000 UTC m=+1544.770027315" observedRunningTime="2025-12-12 00:49:14.992062357 +0000 UTC m=+1545.537415223" watchObservedRunningTime="2025-12-12 00:49:14.9947804 +0000 UTC m=+1545.540133266" Dec 12 00:49:15 crc kubenswrapper[4606]: I1212 00:49:15.535023 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:56702->10.217.0.195:8775: read: connection reset by peer" Dec 12 00:49:15 crc kubenswrapper[4606]: I1212 00:49:15.535906 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:56700->10.217.0.195:8775: read: connection reset by peer" Dec 12 00:49:15 crc kubenswrapper[4606]: I1212 00:49:15.999598 4606 generic.go:334] "Generic (PLEG): container finished" podID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerID="7e10fb794188aaccf0db771b264ebc20e470eaec80fc45b8db4ef8f5f1e2b19b" exitCode=0 Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:15.999885 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176ccd-46b9-44e4-af1b-f02f91762469","Type":"ContainerDied","Data":"7e10fb794188aaccf0db771b264ebc20e470eaec80fc45b8db4ef8f5f1e2b19b"} Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.001020 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="eac07a42-5397-42fd-be30-3677507d5a65" containerID="76493356d682c0e7accbc9e0c87215456eb472aa3fd0b81543206e3f0dc86c5d" exitCode=0 Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.001137 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eac07a42-5397-42fd-be30-3677507d5a65","Type":"ContainerDied","Data":"76493356d682c0e7accbc9e0c87215456eb472aa3fd0b81543206e3f0dc86c5d"} Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.246262 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.250767 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329348 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-config-data\") pod \"a4176ccd-46b9-44e4-af1b-f02f91762469\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329399 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-config-data\") pod \"eac07a42-5397-42fd-be30-3677507d5a65\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329451 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-combined-ca-bundle\") pod \"eac07a42-5397-42fd-be30-3677507d5a65\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329512 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a4176ccd-46b9-44e4-af1b-f02f91762469-logs\") pod \"a4176ccd-46b9-44e4-af1b-f02f91762469\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329557 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-nova-metadata-tls-certs\") pod \"a4176ccd-46b9-44e4-af1b-f02f91762469\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329577 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh4bc\" (UniqueName: \"kubernetes.io/projected/eac07a42-5397-42fd-be30-3677507d5a65-kube-api-access-wh4bc\") pod \"eac07a42-5397-42fd-be30-3677507d5a65\" (UID: \"eac07a42-5397-42fd-be30-3677507d5a65\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329614 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b6dj\" (UniqueName: \"kubernetes.io/projected/a4176ccd-46b9-44e4-af1b-f02f91762469-kube-api-access-6b6dj\") pod \"a4176ccd-46b9-44e4-af1b-f02f91762469\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329670 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-combined-ca-bundle\") pod \"a4176ccd-46b9-44e4-af1b-f02f91762469\" (UID: \"a4176ccd-46b9-44e4-af1b-f02f91762469\") " Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.329941 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4176ccd-46b9-44e4-af1b-f02f91762469-logs" (OuterVolumeSpecName: "logs") pod "a4176ccd-46b9-44e4-af1b-f02f91762469" (UID: "a4176ccd-46b9-44e4-af1b-f02f91762469"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.330290 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4176ccd-46b9-44e4-af1b-f02f91762469-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.338454 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac07a42-5397-42fd-be30-3677507d5a65-kube-api-access-wh4bc" (OuterVolumeSpecName: "kube-api-access-wh4bc") pod "eac07a42-5397-42fd-be30-3677507d5a65" (UID: "eac07a42-5397-42fd-be30-3677507d5a65"). InnerVolumeSpecName "kube-api-access-wh4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.352414 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4176ccd-46b9-44e4-af1b-f02f91762469-kube-api-access-6b6dj" (OuterVolumeSpecName: "kube-api-access-6b6dj") pod "a4176ccd-46b9-44e4-af1b-f02f91762469" (UID: "a4176ccd-46b9-44e4-af1b-f02f91762469"). InnerVolumeSpecName "kube-api-access-6b6dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.387786 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-config-data" (OuterVolumeSpecName: "config-data") pod "a4176ccd-46b9-44e4-af1b-f02f91762469" (UID: "a4176ccd-46b9-44e4-af1b-f02f91762469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.401759 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-config-data" (OuterVolumeSpecName: "config-data") pod "eac07a42-5397-42fd-be30-3677507d5a65" (UID: "eac07a42-5397-42fd-be30-3677507d5a65"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.410679 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4176ccd-46b9-44e4-af1b-f02f91762469" (UID: "a4176ccd-46b9-44e4-af1b-f02f91762469"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.420852 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eac07a42-5397-42fd-be30-3677507d5a65" (UID: "eac07a42-5397-42fd-be30-3677507d5a65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.427776 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a4176ccd-46b9-44e4-af1b-f02f91762469" (UID: "a4176ccd-46b9-44e4-af1b-f02f91762469"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.434432 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.434737 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.434797 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac07a42-5397-42fd-be30-3677507d5a65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.434888 4606 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.434975 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh4bc\" (UniqueName: \"kubernetes.io/projected/eac07a42-5397-42fd-be30-3677507d5a65-kube-api-access-wh4bc\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.435079 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b6dj\" (UniqueName: \"kubernetes.io/projected/a4176ccd-46b9-44e4-af1b-f02f91762469-kube-api-access-6b6dj\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:16 crc kubenswrapper[4606]: I1212 00:49:16.435163 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4176ccd-46b9-44e4-af1b-f02f91762469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:17 crc 
kubenswrapper[4606]: I1212 00:49:17.010161 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eac07a42-5397-42fd-be30-3677507d5a65","Type":"ContainerDied","Data":"6ca15f1e5bf3638c6d30145dbe36cfd605a19a9c538ee8748dfccbf5b8a13983"} Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.011030 4606 scope.go:117] "RemoveContainer" containerID="76493356d682c0e7accbc9e0c87215456eb472aa3fd0b81543206e3f0dc86c5d" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.010463 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.012200 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4176ccd-46b9-44e4-af1b-f02f91762469","Type":"ContainerDied","Data":"5766b600fb26557a1b097f61b630512f202821ffd8d7f3c71a66c0881b354abd"} Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.012251 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.079612 4606 scope.go:117] "RemoveContainer" containerID="7e10fb794188aaccf0db771b264ebc20e470eaec80fc45b8db4ef8f5f1e2b19b" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.122852 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.137756 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.150235 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.163232 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: E1212 00:49:17.163678 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-log" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.163698 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-log" Dec 12 00:49:17 crc kubenswrapper[4606]: E1212 00:49:17.163717 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-metadata" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.163731 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-metadata" Dec 12 00:49:17 crc kubenswrapper[4606]: E1212 00:49:17.163748 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac07a42-5397-42fd-be30-3677507d5a65" containerName="nova-scheduler-scheduler" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.163754 4606 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eac07a42-5397-42fd-be30-3677507d5a65" containerName="nova-scheduler-scheduler" Dec 12 00:49:17 crc kubenswrapper[4606]: E1212 00:49:17.163776 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" containerName="nova-manage" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.163819 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" containerName="nova-manage" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.164010 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-log" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.164028 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" containerName="nova-metadata-metadata" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.164038 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" containerName="nova-manage" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.164059 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac07a42-5397-42fd-be30-3677507d5a65" containerName="nova-scheduler-scheduler" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.164695 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.170315 4606 scope.go:117] "RemoveContainer" containerID="bdf98d1f27cbe685dd1272f2502b449010d9c6a4277dc98d71146a6620cb0715" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.170787 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.194761 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.198357 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.200941 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.212043 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.212240 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.229852 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256134 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkv5\" (UniqueName: \"kubernetes.io/projected/c47ad26c-c149-41c4-8527-5e604b61e0f0-kube-api-access-vpkv5\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256221 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256249 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256329 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np255\" (UniqueName: \"kubernetes.io/projected/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-kube-api-access-np255\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256348 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-logs\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256376 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c47ad26c-c149-41c4-8527-5e604b61e0f0-config-data\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256406 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c47ad26c-c149-41c4-8527-5e604b61e0f0-combined-ca-bundle\") 
pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256425 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-config-data\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.256531 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365117 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365184 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365312 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np255\" (UniqueName: \"kubernetes.io/projected/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-kube-api-access-np255\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365337 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-logs\") pod 
\"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365381 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c47ad26c-c149-41c4-8527-5e604b61e0f0-config-data\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365428 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c47ad26c-c149-41c4-8527-5e604b61e0f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365454 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-config-data\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.365507 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpkv5\" (UniqueName: \"kubernetes.io/projected/c47ad26c-c149-41c4-8527-5e604b61e0f0-kube-api-access-vpkv5\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.371365 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.377813 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c47ad26c-c149-41c4-8527-5e604b61e0f0-config-data\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.381945 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-logs\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.386371 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c47ad26c-c149-41c4-8527-5e604b61e0f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.393010 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpkv5\" (UniqueName: \"kubernetes.io/projected/c47ad26c-c149-41c4-8527-5e604b61e0f0-kube-api-access-vpkv5\") pod \"nova-scheduler-0\" (UID: \"c47ad26c-c149-41c4-8527-5e604b61e0f0\") " pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.399756 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.419687 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-config-data\") pod \"nova-metadata-0\" (UID: 
\"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.429803 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np255\" (UniqueName: \"kubernetes.io/projected/f3157a9f-4b11-4116-8be9-f4cb87e19b9f-kube-api-access-np255\") pod \"nova-metadata-0\" (UID: \"f3157a9f-4b11-4116-8be9-f4cb87e19b9f\") " pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.514551 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.536929 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.718948 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4176ccd-46b9-44e4-af1b-f02f91762469" path="/var/lib/kubelet/pods/a4176ccd-46b9-44e4-af1b-f02f91762469/volumes" Dec 12 00:49:17 crc kubenswrapper[4606]: I1212 00:49:17.722506 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac07a42-5397-42fd-be30-3677507d5a65" path="/var/lib/kubelet/pods/eac07a42-5397-42fd-be30-3677507d5a65/volumes" Dec 12 00:49:18 crc kubenswrapper[4606]: I1212 00:49:18.036510 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 12 00:49:18 crc kubenswrapper[4606]: W1212 00:49:18.160683 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3157a9f_4b11_4116_8be9_f4cb87e19b9f.slice/crio-ac753f4395d4edc0cd6e1d5c6d127ef359c68191bebc9931c6bc39aa33418791 WatchSource:0}: Error finding container ac753f4395d4edc0cd6e1d5c6d127ef359c68191bebc9931c6bc39aa33418791: Status 404 returned error can't find the container with id ac753f4395d4edc0cd6e1d5c6d127ef359c68191bebc9931c6bc39aa33418791 Dec 12 00:49:18 
crc kubenswrapper[4606]: I1212 00:49:18.160707 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 12 00:49:18 crc kubenswrapper[4606]: I1212 00:49:18.948945 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.007561 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-config-data\") pod \"788076ff-f08d-494e-b7ce-23897879660c\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.007711 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788076ff-f08d-494e-b7ce-23897879660c-logs\") pod \"788076ff-f08d-494e-b7ce-23897879660c\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.007733 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-public-tls-certs\") pod \"788076ff-f08d-494e-b7ce-23897879660c\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.007751 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-combined-ca-bundle\") pod \"788076ff-f08d-494e-b7ce-23897879660c\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.008136 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/788076ff-f08d-494e-b7ce-23897879660c-logs" (OuterVolumeSpecName: "logs") pod "788076ff-f08d-494e-b7ce-23897879660c" (UID: 
"788076ff-f08d-494e-b7ce-23897879660c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.008280 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9r54\" (UniqueName: \"kubernetes.io/projected/788076ff-f08d-494e-b7ce-23897879660c-kube-api-access-t9r54\") pod \"788076ff-f08d-494e-b7ce-23897879660c\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.008309 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-internal-tls-certs\") pod \"788076ff-f08d-494e-b7ce-23897879660c\" (UID: \"788076ff-f08d-494e-b7ce-23897879660c\") " Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.008675 4606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/788076ff-f08d-494e-b7ce-23897879660c-logs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.030444 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788076ff-f08d-494e-b7ce-23897879660c-kube-api-access-t9r54" (OuterVolumeSpecName: "kube-api-access-t9r54") pod "788076ff-f08d-494e-b7ce-23897879660c" (UID: "788076ff-f08d-494e-b7ce-23897879660c"). InnerVolumeSpecName "kube-api-access-t9r54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.047696 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3157a9f-4b11-4116-8be9-f4cb87e19b9f","Type":"ContainerStarted","Data":"84993a3b9f08b52002e87033a182e072078e3be5855067e0344772f848593947"} Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.047738 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3157a9f-4b11-4116-8be9-f4cb87e19b9f","Type":"ContainerStarted","Data":"3ba3a831dbace68775ad1cb053fca9935681ec10953adba3f2af5c875434aa30"} Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.047747 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3157a9f-4b11-4116-8be9-f4cb87e19b9f","Type":"ContainerStarted","Data":"ac753f4395d4edc0cd6e1d5c6d127ef359c68191bebc9931c6bc39aa33418791"} Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.060264 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "788076ff-f08d-494e-b7ce-23897879660c" (UID: "788076ff-f08d-494e-b7ce-23897879660c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.062258 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "788076ff-f08d-494e-b7ce-23897879660c" (UID: "788076ff-f08d-494e-b7ce-23897879660c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.078294 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c47ad26c-c149-41c4-8527-5e604b61e0f0","Type":"ContainerStarted","Data":"c3c7b33a2dae945f9d492ecb3a64b1935a16bd6e52bc681239b852c1224197db"} Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.078340 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c47ad26c-c149-41c4-8527-5e604b61e0f0","Type":"ContainerStarted","Data":"0796c30d0d5f0a769cdc4779b19bcf94a1a434979d9639103daa5d122ab1c38f"} Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.088679 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.088663223 podStartE2EDuration="2.088663223s" podCreationTimestamp="2025-12-12 00:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:49:19.071787601 +0000 UTC m=+1549.617140477" watchObservedRunningTime="2025-12-12 00:49:19.088663223 +0000 UTC m=+1549.634016089" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.088992 4606 generic.go:334] "Generic (PLEG): container finished" podID="788076ff-f08d-494e-b7ce-23897879660c" containerID="9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321" exitCode=0 Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.089035 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"788076ff-f08d-494e-b7ce-23897879660c","Type":"ContainerDied","Data":"9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321"} Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.089060 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"788076ff-f08d-494e-b7ce-23897879660c","Type":"ContainerDied","Data":"6c2d1ae960b6beb782608cd72240004bd98e8b667cbd85dccd11da921ffc5592"} Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.089077 4606 scope.go:117] "RemoveContainer" containerID="9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.089232 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.108488 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.108469464 podStartE2EDuration="2.108469464s" podCreationTimestamp="2025-12-12 00:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:49:19.097352576 +0000 UTC m=+1549.642705462" watchObservedRunningTime="2025-12-12 00:49:19.108469464 +0000 UTC m=+1549.653822330" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.115338 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.115379 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9r54\" (UniqueName: \"kubernetes.io/projected/788076ff-f08d-494e-b7ce-23897879660c-kube-api-access-t9r54\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.115394 4606 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.126440 4606 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-config-data" (OuterVolumeSpecName: "config-data") pod "788076ff-f08d-494e-b7ce-23897879660c" (UID: "788076ff-f08d-494e-b7ce-23897879660c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.138015 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "788076ff-f08d-494e-b7ce-23897879660c" (UID: "788076ff-f08d-494e-b7ce-23897879660c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.216825 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.217152 4606 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788076ff-f08d-494e-b7ce-23897879660c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.221672 4606 scope.go:117] "RemoveContainer" containerID="27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.243009 4606 scope.go:117] "RemoveContainer" containerID="9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321" Dec 12 00:49:19 crc kubenswrapper[4606]: E1212 00:49:19.243473 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321\": container with ID starting with 
9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321 not found: ID does not exist" containerID="9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.243508 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321"} err="failed to get container status \"9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321\": rpc error: code = NotFound desc = could not find container \"9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321\": container with ID starting with 9db76ff4e6e38f42ea7581306a0e2c09815892feaf64e25eb23dc323434de321 not found: ID does not exist" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.243530 4606 scope.go:117] "RemoveContainer" containerID="27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353" Dec 12 00:49:19 crc kubenswrapper[4606]: E1212 00:49:19.243856 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353\": container with ID starting with 27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353 not found: ID does not exist" containerID="27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.243914 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353"} err="failed to get container status \"27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353\": rpc error: code = NotFound desc = could not find container \"27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353\": container with ID starting with 27adee9d67e7ade1d4e757bdd742833c9cc9e8ed08f91f347cccbd879eeb3353 not found: ID does not 
exist" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.448464 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.468258 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.488513 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:19 crc kubenswrapper[4606]: E1212 00:49:19.488934 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-log" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.488952 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-log" Dec 12 00:49:19 crc kubenswrapper[4606]: E1212 00:49:19.488973 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-api" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.488979 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-api" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.489161 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-api" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.489204 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="788076ff-f08d-494e-b7ce-23897879660c" containerName="nova-api-log" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.490302 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.493398 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.493991 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.495110 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.508681 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.630824 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-config-data\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.630893 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.630925 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab424069-d5cd-4b92-b1d8-1311cffefad6-logs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.630966 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.631103 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfg69\" (UniqueName: \"kubernetes.io/projected/ab424069-d5cd-4b92-b1d8-1311cffefad6-kube-api-access-xfg69\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.631297 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-public-tls-certs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.709544 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="788076ff-f08d-494e-b7ce-23897879660c" path="/var/lib/kubelet/pods/788076ff-f08d-494e-b7ce-23897879660c/volumes" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.732531 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-public-tls-certs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.732594 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-config-data\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.732627 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.732650 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab424069-d5cd-4b92-b1d8-1311cffefad6-logs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.732669 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.732823 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfg69\" (UniqueName: \"kubernetes.io/projected/ab424069-d5cd-4b92-b1d8-1311cffefad6-kube-api-access-xfg69\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.733400 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab424069-d5cd-4b92-b1d8-1311cffefad6-logs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.736657 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" 
Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.736745 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.737017 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-public-tls-certs\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.751439 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab424069-d5cd-4b92-b1d8-1311cffefad6-config-data\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.753863 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfg69\" (UniqueName: \"kubernetes.io/projected/ab424069-d5cd-4b92-b1d8-1311cffefad6-kube-api-access-xfg69\") pod \"nova-api-0\" (UID: \"ab424069-d5cd-4b92-b1d8-1311cffefad6\") " pod="openstack/nova-api-0" Dec 12 00:49:19 crc kubenswrapper[4606]: I1212 00:49:19.812287 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 12 00:49:20 crc kubenswrapper[4606]: I1212 00:49:20.343745 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 12 00:49:21 crc kubenswrapper[4606]: I1212 00:49:21.112437 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab424069-d5cd-4b92-b1d8-1311cffefad6","Type":"ContainerStarted","Data":"43ca2ea71b926e14e14a608e431664656cec022124cff3df65d252b26224ccdf"} Dec 12 00:49:21 crc kubenswrapper[4606]: I1212 00:49:21.112725 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab424069-d5cd-4b92-b1d8-1311cffefad6","Type":"ContainerStarted","Data":"056698b0d32e8f435de97c4cfb86170cc5eff45372300390cb97cdb88ce4c127"} Dec 12 00:49:21 crc kubenswrapper[4606]: I1212 00:49:21.112739 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab424069-d5cd-4b92-b1d8-1311cffefad6","Type":"ContainerStarted","Data":"eb1d339025bdb36b2deed273e1475c50d2f85adae6c305badc6da0ebd0033264"} Dec 12 00:49:21 crc kubenswrapper[4606]: I1212 00:49:21.161954 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.161932086 podStartE2EDuration="2.161932086s" podCreationTimestamp="2025-12-12 00:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:49:21.151434905 +0000 UTC m=+1551.696787781" watchObservedRunningTime="2025-12-12 00:49:21.161932086 +0000 UTC m=+1551.707284952" Dec 12 00:49:22 crc kubenswrapper[4606]: I1212 00:49:22.515362 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 12 00:49:22 crc kubenswrapper[4606]: I1212 00:49:22.537463 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 12 00:49:22 crc 
kubenswrapper[4606]: I1212 00:49:22.538916 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 12 00:49:22 crc kubenswrapper[4606]: I1212 00:49:22.699670 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:49:22 crc kubenswrapper[4606]: E1212 00:49:22.699947 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:49:23 crc kubenswrapper[4606]: I1212 00:49:23.235354 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:49:23 crc kubenswrapper[4606]: I1212 00:49:23.283855 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:49:23 crc kubenswrapper[4606]: I1212 00:49:23.986437 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-frh6k"] Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.150692 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-frh6k" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="registry-server" containerID="cri-o://68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759" gracePeriod=2 Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.602744 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.652608 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-utilities\") pod \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.652897 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62wc8\" (UniqueName: \"kubernetes.io/projected/459d1fea-af9d-46b7-ad3a-057dc9b980f2-kube-api-access-62wc8\") pod \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.653095 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-catalog-content\") pod \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\" (UID: \"459d1fea-af9d-46b7-ad3a-057dc9b980f2\") " Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.653643 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-utilities" (OuterVolumeSpecName: "utilities") pod "459d1fea-af9d-46b7-ad3a-057dc9b980f2" (UID: "459d1fea-af9d-46b7-ad3a-057dc9b980f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.672101 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459d1fea-af9d-46b7-ad3a-057dc9b980f2-kube-api-access-62wc8" (OuterVolumeSpecName: "kube-api-access-62wc8") pod "459d1fea-af9d-46b7-ad3a-057dc9b980f2" (UID: "459d1fea-af9d-46b7-ad3a-057dc9b980f2"). InnerVolumeSpecName "kube-api-access-62wc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.760090 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.760366 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62wc8\" (UniqueName: \"kubernetes.io/projected/459d1fea-af9d-46b7-ad3a-057dc9b980f2-kube-api-access-62wc8\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.769592 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "459d1fea-af9d-46b7-ad3a-057dc9b980f2" (UID: "459d1fea-af9d-46b7-ad3a-057dc9b980f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:49:25 crc kubenswrapper[4606]: I1212 00:49:25.862185 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d1fea-af9d-46b7-ad3a-057dc9b980f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.165492 4606 generic.go:334] "Generic (PLEG): container finished" podID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerID="68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759" exitCode=0 Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.165828 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frh6k" event={"ID":"459d1fea-af9d-46b7-ad3a-057dc9b980f2","Type":"ContainerDied","Data":"68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759"} Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.165854 4606 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-frh6k" event={"ID":"459d1fea-af9d-46b7-ad3a-057dc9b980f2","Type":"ContainerDied","Data":"dbe15dcf9524a28856eb82721eb2ec26aa6767253c8102118aea53b5c6094126"} Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.165869 4606 scope.go:117] "RemoveContainer" containerID="68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.165988 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frh6k" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.198346 4606 scope.go:117] "RemoveContainer" containerID="1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.297278 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-frh6k"] Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.305666 4606 scope.go:117] "RemoveContainer" containerID="1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.308487 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-frh6k"] Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.336236 4606 scope.go:117] "RemoveContainer" containerID="68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759" Dec 12 00:49:26 crc kubenswrapper[4606]: E1212 00:49:26.336651 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759\": container with ID starting with 68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759 not found: ID does not exist" containerID="68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.336683 4606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759"} err="failed to get container status \"68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759\": rpc error: code = NotFound desc = could not find container \"68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759\": container with ID starting with 68b421032f2d46c01319e6120e1f78bc5dacededb5ba2d3fa3dfacae051b6759 not found: ID does not exist" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.336704 4606 scope.go:117] "RemoveContainer" containerID="1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53" Dec 12 00:49:26 crc kubenswrapper[4606]: E1212 00:49:26.336904 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53\": container with ID starting with 1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53 not found: ID does not exist" containerID="1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.336925 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53"} err="failed to get container status \"1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53\": rpc error: code = NotFound desc = could not find container \"1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53\": container with ID starting with 1ff9f15affa16338ee7c1a31ca8734bc9c5ee043118de138ae21b0fa6272fa53 not found: ID does not exist" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.336937 4606 scope.go:117] "RemoveContainer" containerID="1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f" Dec 12 00:49:26 crc kubenswrapper[4606]: E1212 
00:49:26.337094 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f\": container with ID starting with 1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f not found: ID does not exist" containerID="1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f" Dec 12 00:49:26 crc kubenswrapper[4606]: I1212 00:49:26.337112 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f"} err="failed to get container status \"1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f\": rpc error: code = NotFound desc = could not find container \"1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f\": container with ID starting with 1fd3d72af352fc3e4c858caf8b200b88f45aea4e3801c2f73ab1e63867e6945f not found: ID does not exist" Dec 12 00:49:27 crc kubenswrapper[4606]: I1212 00:49:27.515382 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 12 00:49:27 crc kubenswrapper[4606]: I1212 00:49:27.537698 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 12 00:49:27 crc kubenswrapper[4606]: I1212 00:49:27.537862 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 12 00:49:27 crc kubenswrapper[4606]: I1212 00:49:27.547228 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 12 00:49:27 crc kubenswrapper[4606]: I1212 00:49:27.710849 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" path="/var/lib/kubelet/pods/459d1fea-af9d-46b7-ad3a-057dc9b980f2/volumes" Dec 12 00:49:28 crc kubenswrapper[4606]: I1212 
00:49:28.222663 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 12 00:49:28 crc kubenswrapper[4606]: I1212 00:49:28.550329 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3157a9f-4b11-4116-8be9-f4cb87e19b9f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 12 00:49:28 crc kubenswrapper[4606]: I1212 00:49:28.550372 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3157a9f-4b11-4116-8be9-f4cb87e19b9f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 12 00:49:29 crc kubenswrapper[4606]: I1212 00:49:29.812867 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 12 00:49:29 crc kubenswrapper[4606]: I1212 00:49:29.812928 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 12 00:49:30 crc kubenswrapper[4606]: I1212 00:49:30.824329 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab424069-d5cd-4b92-b1d8-1311cffefad6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 12 00:49:30 crc kubenswrapper[4606]: I1212 00:49:30.824352 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab424069-d5cd-4b92-b1d8-1311cffefad6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 12 00:49:33 crc kubenswrapper[4606]: I1212 00:49:33.700932 4606 
scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:49:33 crc kubenswrapper[4606]: E1212 00:49:33.701789 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:49:37 crc kubenswrapper[4606]: I1212 00:49:37.549117 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 12 00:49:37 crc kubenswrapper[4606]: I1212 00:49:37.557627 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 12 00:49:37 crc kubenswrapper[4606]: I1212 00:49:37.558362 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 12 00:49:38 crc kubenswrapper[4606]: I1212 00:49:38.289265 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 12 00:49:39 crc kubenswrapper[4606]: I1212 00:49:39.820610 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 12 00:49:39 crc kubenswrapper[4606]: I1212 00:49:39.821225 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 12 00:49:39 crc kubenswrapper[4606]: I1212 00:49:39.823970 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 12 00:49:39 crc kubenswrapper[4606]: I1212 00:49:39.831655 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 12 00:49:40 crc kubenswrapper[4606]: I1212 00:49:40.278256 4606 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 12 00:49:40 crc kubenswrapper[4606]: I1212 00:49:40.311587 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 12 00:49:40 crc kubenswrapper[4606]: I1212 00:49:40.340564 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 12 00:49:47 crc kubenswrapper[4606]: I1212 00:49:47.700126 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:49:47 crc kubenswrapper[4606]: E1212 00:49:47.700980 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:49:51 crc kubenswrapper[4606]: I1212 00:49:51.200175 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:49:52 crc kubenswrapper[4606]: I1212 00:49:52.263780 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:49:55 crc kubenswrapper[4606]: I1212 00:49:55.559624 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerName="rabbitmq" containerID="cri-o://b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d" gracePeriod=604796 Dec 12 00:49:57 crc kubenswrapper[4606]: I1212 00:49:57.157689 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" 
containerName="rabbitmq" containerID="cri-o://bb5b1403233bfd6bb8bd59205d11593b5e080ca5002733c88e5911b16beedaf3" gracePeriod=604796 Dec 12 00:49:57 crc kubenswrapper[4606]: I1212 00:49:57.522994 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 12 00:49:57 crc kubenswrapper[4606]: I1212 00:49:57.913145 4606 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 12 00:49:58 crc kubenswrapper[4606]: I1212 00:49:58.699274 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:49:58 crc kubenswrapper[4606]: E1212 00:49:58.699580 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.170532 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.353913 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-erlang-cookie\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.354385 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-confd\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.354435 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.355205 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.356769 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e415e37-636f-4f5d-a64e-4dd815e6030e-pod-info\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.356883 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8fnp\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-kube-api-access-g8fnp\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.356909 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-plugins-conf\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.356932 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-config-data\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.356950 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-server-conf\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.356976 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/0e415e37-636f-4f5d-a64e-4dd815e6030e-erlang-cookie-secret\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.357011 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-tls\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.357031 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-plugins\") pod \"0e415e37-636f-4f5d-a64e-4dd815e6030e\" (UID: \"0e415e37-636f-4f5d-a64e-4dd815e6030e\") " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.357629 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.358253 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.358500 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.363215 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-kube-api-access-g8fnp" (OuterVolumeSpecName: "kube-api-access-g8fnp") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "kube-api-access-g8fnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.364740 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0e415e37-636f-4f5d-a64e-4dd815e6030e-pod-info" (OuterVolumeSpecName: "pod-info") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.364761 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e415e37-636f-4f5d-a64e-4dd815e6030e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.369112 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.379023 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.388563 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-config-data" (OuterVolumeSpecName: "config-data") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.433931 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-server-conf" (OuterVolumeSpecName: "server-conf") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.458996 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459022 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459044 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459052 4606 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0e415e37-636f-4f5d-a64e-4dd815e6030e-pod-info\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459062 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8fnp\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-kube-api-access-g8fnp\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459073 4606 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459081 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459089 4606 
reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0e415e37-636f-4f5d-a64e-4dd815e6030e-server-conf\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.459099 4606 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0e415e37-636f-4f5d-a64e-4dd815e6030e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.490068 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.496408 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0e415e37-636f-4f5d-a64e-4dd815e6030e" (UID: "0e415e37-636f-4f5d-a64e-4dd815e6030e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.553743 4606 generic.go:334] "Generic (PLEG): container finished" podID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerID="b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d" exitCode=0 Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.553808 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0e415e37-636f-4f5d-a64e-4dd815e6030e","Type":"ContainerDied","Data":"b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d"} Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.553836 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0e415e37-636f-4f5d-a64e-4dd815e6030e","Type":"ContainerDied","Data":"46d6b3a8daa9ea589f99763355e721931535c6633a9bf7e7686bae3882f9f2d8"} Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.553852 4606 scope.go:117] "RemoveContainer" containerID="b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.553972 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.562002 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0e415e37-636f-4f5d-a64e-4dd815e6030e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.562572 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.593747 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.595871 4606 scope.go:117] "RemoveContainer" containerID="8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.603660 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.630872 4606 scope.go:117] "RemoveContainer" containerID="b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d" Dec 12 00:50:02 crc kubenswrapper[4606]: E1212 00:50:02.631507 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d\": container with ID starting with b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d not found: ID does not exist" containerID="b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.631543 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d"} err="failed to get container status 
\"b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d\": rpc error: code = NotFound desc = could not find container \"b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d\": container with ID starting with b69a0aacdfb7feb06f0e290611d83e131639980def628f7c7a0480887ccff02d not found: ID does not exist" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.631571 4606 scope.go:117] "RemoveContainer" containerID="8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c" Dec 12 00:50:02 crc kubenswrapper[4606]: E1212 00:50:02.632216 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c\": container with ID starting with 8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c not found: ID does not exist" containerID="8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.632248 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c"} err="failed to get container status \"8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c\": rpc error: code = NotFound desc = could not find container \"8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c\": container with ID starting with 8f65950682d8d7663800a314d38af8809bafba5b459d243224796e51d49acb3c not found: ID does not exist" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.635415 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:50:02 crc kubenswrapper[4606]: E1212 00:50:02.635801 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="extract-content" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.635817 4606 
state_mem.go:107] "Deleted CPUSet assignment" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="extract-content" Dec 12 00:50:02 crc kubenswrapper[4606]: E1212 00:50:02.635829 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerName="rabbitmq" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.635835 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerName="rabbitmq" Dec 12 00:50:02 crc kubenswrapper[4606]: E1212 00:50:02.635846 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="registry-server" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.635852 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="registry-server" Dec 12 00:50:02 crc kubenswrapper[4606]: E1212 00:50:02.635861 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerName="setup-container" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.635866 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerName="setup-container" Dec 12 00:50:02 crc kubenswrapper[4606]: E1212 00:50:02.635882 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="extract-utilities" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.635888 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="extract-utilities" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.636057 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" containerName="rabbitmq" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.636074 4606 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="459d1fea-af9d-46b7-ad3a-057dc9b980f2" containerName="registry-server" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.638762 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.641294 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-666rg" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.641359 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.641619 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.641682 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.642577 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.646710 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.646832 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.648193 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769002 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 
12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769269 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-config-data\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769518 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92ftf\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-kube-api-access-92ftf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769585 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769625 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769679 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769726 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769784 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769814 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aec304bf-3003-493d-9e17-3a2f75997bdb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769852 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aec304bf-3003-493d-9e17-3a2f75997bdb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.769886 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.871675 4606 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aec304bf-3003-493d-9e17-3a2f75997bdb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.871905 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.872020 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.872094 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-config-data\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.872219 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92ftf\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-kube-api-access-92ftf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.872304 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-server-conf\") pod \"rabbitmq-server-0\" 
(UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.872378 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.872454 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.875378 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.875559 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.875694 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aec304bf-3003-493d-9e17-3a2f75997bdb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.873792 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.874816 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.875554 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aec304bf-3003-493d-9e17-3a2f75997bdb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.875763 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.872970 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.873779 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.873572 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aec304bf-3003-493d-9e17-3a2f75997bdb-config-data\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.877342 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.890559 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.891260 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92ftf\" (UniqueName: \"kubernetes.io/projected/aec304bf-3003-493d-9e17-3a2f75997bdb-kube-api-access-92ftf\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.896201 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aec304bf-3003-493d-9e17-3a2f75997bdb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.919025 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"aec304bf-3003-493d-9e17-3a2f75997bdb\") " pod="openstack/rabbitmq-server-0" Dec 12 00:50:02 crc kubenswrapper[4606]: I1212 00:50:02.968804 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.493331 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.582526 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aec304bf-3003-493d-9e17-3a2f75997bdb","Type":"ContainerStarted","Data":"fd1c372363c66c16ade8b8205fe522e20cfd99969a70e0e05e2f3ff503e43416"} Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.588628 4606 generic.go:334] "Generic (PLEG): container finished" podID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" containerID="bb5b1403233bfd6bb8bd59205d11593b5e080ca5002733c88e5911b16beedaf3" exitCode=0 Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.588882 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd9fd090-7c43-44f4-9951-10b4528fc8a2","Type":"ContainerDied","Data":"bb5b1403233bfd6bb8bd59205d11593b5e080ca5002733c88e5911b16beedaf3"} Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.671499 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.718486 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e415e37-636f-4f5d-a64e-4dd815e6030e" path="/var/lib/kubelet/pods/0e415e37-636f-4f5d-a64e-4dd815e6030e/volumes" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.790869 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-erlang-cookie\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791001 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-plugins-conf\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791060 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd9fd090-7c43-44f4-9951-10b4528fc8a2-pod-info\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791093 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcbhk\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-kube-api-access-qcbhk\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791124 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791162 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-tls\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791231 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-plugins\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791266 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-config-data\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791339 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-confd\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791396 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd9fd090-7c43-44f4-9951-10b4528fc8a2-erlang-cookie-secret\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.791453 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-server-conf\") pod \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\" (UID: \"bd9fd090-7c43-44f4-9951-10b4528fc8a2\") " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.792147 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.792388 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.798352 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.806312 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-kube-api-access-qcbhk" (OuterVolumeSpecName: "kube-api-access-qcbhk") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "kube-api-access-qcbhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.808354 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.808685 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd9fd090-7c43-44f4-9951-10b4528fc8a2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.808655 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.825079 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bd9fd090-7c43-44f4-9951-10b4528fc8a2-pod-info" (OuterVolumeSpecName: "pod-info") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.849609 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-config-data" (OuterVolumeSpecName: "config-data") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.890517 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-server-conf" (OuterVolumeSpecName: "server-conf") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.897696 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.897856 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.897951 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.898022 4606 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd9fd090-7c43-44f4-9951-10b4528fc8a2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" 
Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.898225 4606 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-server-conf\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.898301 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.898376 4606 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd9fd090-7c43-44f4-9951-10b4528fc8a2-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.898461 4606 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd9fd090-7c43-44f4-9951-10b4528fc8a2-pod-info\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.898543 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcbhk\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-kube-api-access-qcbhk\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.898638 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 12 00:50:03 crc kubenswrapper[4606]: I1212 00:50:03.952776 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.000109 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.019561 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bd9fd090-7c43-44f4-9951-10b4528fc8a2" (UID: "bd9fd090-7c43-44f4-9951-10b4528fc8a2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.101689 4606 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd9fd090-7c43-44f4-9951-10b4528fc8a2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.605572 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd9fd090-7c43-44f4-9951-10b4528fc8a2","Type":"ContainerDied","Data":"1c8f9b1263c4b5b0e26b095d054e0d7444112d6765d12422bba276e6ffdc26a4"} Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.605621 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.605624 4606 scope.go:117] "RemoveContainer" containerID="bb5b1403233bfd6bb8bd59205d11593b5e080ca5002733c88e5911b16beedaf3" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.641050 4606 scope.go:117] "RemoveContainer" containerID="0de71c822d771a40ccdc61ecaaab12bca9931df22b2c4c086696c1a0a0173f7d" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.642447 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.655189 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.671458 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:50:04 crc kubenswrapper[4606]: E1212 00:50:04.671982 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" containerName="setup-container" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.672074 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" containerName="setup-container" Dec 12 00:50:04 crc kubenswrapper[4606]: E1212 00:50:04.672209 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" containerName="rabbitmq" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.672261 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" containerName="rabbitmq" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.672560 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" containerName="rabbitmq" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.673882 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.677908 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.683388 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.683538 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xdfhc" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.683646 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.684297 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.702580 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.702837 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.725517 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.824200 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b744992a-d383-4df5-859e-b24a8e70c1bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.824500 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.824631 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.824742 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.824845 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.824974 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b744992a-d383-4df5-859e-b24a8e70c1bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.825190 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.825316 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.825469 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.825561 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.825651 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rprg2\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-kube-api-access-rprg2\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929149 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929211 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929231 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rprg2\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-kube-api-access-rprg2\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929286 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b744992a-d383-4df5-859e-b24a8e70c1bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929318 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929334 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929362 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929390 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929431 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b744992a-d383-4df5-859e-b24a8e70c1bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929456 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.929486 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 
00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.930964 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.931556 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.932247 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.932305 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.934790 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.942353 4606 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b744992a-d383-4df5-859e-b24a8e70c1bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.943242 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.943895 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b744992a-d383-4df5-859e-b24a8e70c1bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.944674 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b744992a-d383-4df5-859e-b24a8e70c1bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.949193 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.952325 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rprg2\" (UniqueName: \"kubernetes.io/projected/b744992a-d383-4df5-859e-b24a8e70c1bb-kube-api-access-rprg2\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:04 crc kubenswrapper[4606]: I1212 00:50:04.964678 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b744992a-d383-4df5-859e-b24a8e70c1bb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:05 crc kubenswrapper[4606]: I1212 00:50:05.032786 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:05 crc kubenswrapper[4606]: I1212 00:50:05.456891 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 12 00:50:05 crc kubenswrapper[4606]: W1212 00:50:05.458815 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb744992a_d383_4df5_859e_b24a8e70c1bb.slice/crio-95f2bba514ed2475ce1ec9ed17c2fbc42dcbd862c884cee331c97d71e0d43150 WatchSource:0}: Error finding container 95f2bba514ed2475ce1ec9ed17c2fbc42dcbd862c884cee331c97d71e0d43150: Status 404 returned error can't find the container with id 95f2bba514ed2475ce1ec9ed17c2fbc42dcbd862c884cee331c97d71e0d43150 Dec 12 00:50:05 crc kubenswrapper[4606]: I1212 00:50:05.617768 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b744992a-d383-4df5-859e-b24a8e70c1bb","Type":"ContainerStarted","Data":"95f2bba514ed2475ce1ec9ed17c2fbc42dcbd862c884cee331c97d71e0d43150"} Dec 12 00:50:05 crc kubenswrapper[4606]: I1212 00:50:05.619901 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aec304bf-3003-493d-9e17-3a2f75997bdb","Type":"ContainerStarted","Data":"5bc39d8f3ddf5d9c9dd2080ad4f3f8533edb9b6550252d90280ecb869d7eb3bf"} Dec 12 00:50:05 crc kubenswrapper[4606]: 
I1212 00:50:05.725975 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd9fd090-7c43-44f4-9951-10b4528fc8a2" path="/var/lib/kubelet/pods/bd9fd090-7c43-44f4-9951-10b4528fc8a2/volumes" Dec 12 00:50:07 crc kubenswrapper[4606]: I1212 00:50:07.640601 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b744992a-d383-4df5-859e-b24a8e70c1bb","Type":"ContainerStarted","Data":"0ecc2404a92d1a0f02ed8b1e89f262f7d48c12b623218d2152967c539e5b7955"} Dec 12 00:50:09 crc kubenswrapper[4606]: I1212 00:50:09.730968 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:50:09 crc kubenswrapper[4606]: E1212 00:50:09.731815 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.544679 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-qbx5q"] Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.546901 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.548963 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.569283 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-qbx5q"] Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.639847 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.640339 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvgfd\" (UniqueName: \"kubernetes.io/projected/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-kube-api-access-xvgfd\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.640388 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.640417 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-svc\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " 
pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.640446 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-config\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.640506 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.640544 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.743348 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvgfd\" (UniqueName: \"kubernetes.io/projected/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-kube-api-access-xvgfd\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.744347 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: 
\"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.744449 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-svc\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.744548 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-config\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.744681 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.744768 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.744928 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " 
pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.745330 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-config\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.745361 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-svc\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.745556 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.745720 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.745725 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.746063 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.770301 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvgfd\" (UniqueName: \"kubernetes.io/projected/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-kube-api-access-xvgfd\") pod \"dnsmasq-dns-d558885bc-qbx5q\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:10 crc kubenswrapper[4606]: I1212 00:50:10.870492 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:11 crc kubenswrapper[4606]: I1212 00:50:11.375808 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-qbx5q"] Dec 12 00:50:11 crc kubenswrapper[4606]: I1212 00:50:11.696870 4606 generic.go:334] "Generic (PLEG): container finished" podID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerID="8d06268c8fb96d9c6968de678391f9c6e99c282b8c911601658fe2a57c4e3e3c" exitCode=0 Dec 12 00:50:11 crc kubenswrapper[4606]: I1212 00:50:11.697017 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" event={"ID":"c8a76785-a630-4df9-ac2e-1f2a93ecc0db","Type":"ContainerDied","Data":"8d06268c8fb96d9c6968de678391f9c6e99c282b8c911601658fe2a57c4e3e3c"} Dec 12 00:50:11 crc kubenswrapper[4606]: I1212 00:50:11.697188 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" event={"ID":"c8a76785-a630-4df9-ac2e-1f2a93ecc0db","Type":"ContainerStarted","Data":"1e2deaae151ed4036383a4639872cd03dd42fcb21f28794481e7fbadaede3921"} Dec 12 00:50:12 crc kubenswrapper[4606]: I1212 
00:50:12.710085 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" event={"ID":"c8a76785-a630-4df9-ac2e-1f2a93ecc0db","Type":"ContainerStarted","Data":"3181fef06a3d6ce0a21ddbdd0a4703195aeb701f665cdecc71f0073d020ab6cc"} Dec 12 00:50:12 crc kubenswrapper[4606]: I1212 00:50:12.710532 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:12 crc kubenswrapper[4606]: I1212 00:50:12.736944 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" podStartSLOduration=2.736899824 podStartE2EDuration="2.736899824s" podCreationTimestamp="2025-12-12 00:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:50:12.728754765 +0000 UTC m=+1603.274107631" watchObservedRunningTime="2025-12-12 00:50:12.736899824 +0000 UTC m=+1603.282252690" Dec 12 00:50:20 crc kubenswrapper[4606]: I1212 00:50:20.700093 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:50:20 crc kubenswrapper[4606]: E1212 00:50:20.701048 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:50:20 crc kubenswrapper[4606]: I1212 00:50:20.872053 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:20 crc kubenswrapper[4606]: I1212 00:50:20.956978 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-cd5cbd7b9-78bt2"] Dec 12 00:50:20 crc kubenswrapper[4606]: I1212 00:50:20.957261 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" podUID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerName="dnsmasq-dns" containerID="cri-o://970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259" gracePeriod=10 Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.124471 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94d468747-glhd9"] Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.126409 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.147858 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94d468747-glhd9"] Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.278208 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-config\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.278251 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-ovsdbserver-nb\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.278275 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.278299 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-dns-svc\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.278381 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5c6\" (UniqueName: \"kubernetes.io/projected/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-kube-api-access-5p5c6\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.278405 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-dns-swift-storage-0\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.278440 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-ovsdbserver-sb\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.384829 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5c6\" (UniqueName: 
\"kubernetes.io/projected/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-kube-api-access-5p5c6\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.385240 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-dns-swift-storage-0\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.385341 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-ovsdbserver-sb\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.386304 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-dns-swift-storage-0\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.390110 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-ovsdbserver-sb\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.390357 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-config\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.390381 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-ovsdbserver-nb\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.390409 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-openstack-edpm-ipam\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.390441 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-dns-svc\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.391236 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-dns-svc\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.391783 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-config\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: 
\"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.392387 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-ovsdbserver-nb\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.392819 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-openstack-edpm-ipam\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.405760 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5c6\" (UniqueName: \"kubernetes.io/projected/9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9-kube-api-access-5p5c6\") pod \"dnsmasq-dns-94d468747-glhd9\" (UID: \"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9\") " pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.450996 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.650543 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.824528 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-nb\") pod \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.825464 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-sb\") pod \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.825724 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-svc\") pod \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.825818 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwmcv\" (UniqueName: \"kubernetes.io/projected/a8632450-2d2a-4683-a1f8-fa91a510e5bd-kube-api-access-lwmcv\") pod \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.825919 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-config\") pod \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.826024 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-swift-storage-0\") pod \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\" (UID: \"a8632450-2d2a-4683-a1f8-fa91a510e5bd\") " Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.862110 4606 generic.go:334] "Generic (PLEG): container finished" podID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerID="970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259" exitCode=0 Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.862291 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" event={"ID":"a8632450-2d2a-4683-a1f8-fa91a510e5bd","Type":"ContainerDied","Data":"970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259"} Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.862505 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" event={"ID":"a8632450-2d2a-4683-a1f8-fa91a510e5bd","Type":"ContainerDied","Data":"3c741bcede5b7732ebe5d748c41109cbef64f871b4e47df3971002b64dc5ddec"} Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.862534 4606 scope.go:117] "RemoveContainer" containerID="970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.862736 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-78bt2" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.908762 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8632450-2d2a-4683-a1f8-fa91a510e5bd-kube-api-access-lwmcv" (OuterVolumeSpecName: "kube-api-access-lwmcv") pod "a8632450-2d2a-4683-a1f8-fa91a510e5bd" (UID: "a8632450-2d2a-4683-a1f8-fa91a510e5bd"). InnerVolumeSpecName "kube-api-access-lwmcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.922069 4606 scope.go:117] "RemoveContainer" containerID="d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.928999 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwmcv\" (UniqueName: \"kubernetes.io/projected/a8632450-2d2a-4683-a1f8-fa91a510e5bd-kube-api-access-lwmcv\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.934507 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8632450-2d2a-4683-a1f8-fa91a510e5bd" (UID: "a8632450-2d2a-4683-a1f8-fa91a510e5bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.976820 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-config" (OuterVolumeSpecName: "config") pod "a8632450-2d2a-4683-a1f8-fa91a510e5bd" (UID: "a8632450-2d2a-4683-a1f8-fa91a510e5bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.985813 4606 scope.go:117] "RemoveContainer" containerID="970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259" Dec 12 00:50:21 crc kubenswrapper[4606]: E1212 00:50:21.986752 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259\": container with ID starting with 970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259 not found: ID does not exist" containerID="970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.986796 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259"} err="failed to get container status \"970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259\": rpc error: code = NotFound desc = could not find container \"970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259\": container with ID starting with 970367b0c541a354fc32e548e05e79678a24a044c272d623660a060bea052259 not found: ID does not exist" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.986825 4606 scope.go:117] "RemoveContainer" containerID="d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86" Dec 12 00:50:21 crc kubenswrapper[4606]: E1212 00:50:21.987203 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86\": container with ID starting with d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86 not found: ID does not exist" containerID="d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.987225 
4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86"} err="failed to get container status \"d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86\": rpc error: code = NotFound desc = could not find container \"d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86\": container with ID starting with d6910078722b4ed4898959861a8b0d6edd4f2ed9a4261f6686ac2f1f45232d86 not found: ID does not exist" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.997538 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8632450-2d2a-4683-a1f8-fa91a510e5bd" (UID: "a8632450-2d2a-4683-a1f8-fa91a510e5bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:21 crc kubenswrapper[4606]: I1212 00:50:21.998748 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8632450-2d2a-4683-a1f8-fa91a510e5bd" (UID: "a8632450-2d2a-4683-a1f8-fa91a510e5bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.001085 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a8632450-2d2a-4683-a1f8-fa91a510e5bd" (UID: "a8632450-2d2a-4683-a1f8-fa91a510e5bd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.030728 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.030761 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.030772 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.030784 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.030792 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8632450-2d2a-4683-a1f8-fa91a510e5bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.202296 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-78bt2"] Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.218234 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-78bt2"] Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.234489 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94d468747-glhd9"] Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.873448 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9" containerID="5c54e57b8ba78da59c6c631af860b39585d7a6e31a5296efe51eeafcdb8fe499" exitCode=0 Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.873495 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d468747-glhd9" event={"ID":"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9","Type":"ContainerDied","Data":"5c54e57b8ba78da59c6c631af860b39585d7a6e31a5296efe51eeafcdb8fe499"} Dec 12 00:50:22 crc kubenswrapper[4606]: I1212 00:50:22.873819 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d468747-glhd9" event={"ID":"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9","Type":"ContainerStarted","Data":"95dc6cf373c6f994d11e19acab3375087f168eba680006e17144c2bc9423e2c5"} Dec 12 00:50:23 crc kubenswrapper[4606]: I1212 00:50:23.718732 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" path="/var/lib/kubelet/pods/a8632450-2d2a-4683-a1f8-fa91a510e5bd/volumes" Dec 12 00:50:23 crc kubenswrapper[4606]: I1212 00:50:23.888514 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d468747-glhd9" event={"ID":"9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9","Type":"ContainerStarted","Data":"49db7e69b66766a24108a7119604f610c815ed1d1f88890bc9cc401a58d6085c"} Dec 12 00:50:23 crc kubenswrapper[4606]: I1212 00:50:23.890002 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:23 crc kubenswrapper[4606]: I1212 00:50:23.930023 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94d468747-glhd9" podStartSLOduration=2.930001412 podStartE2EDuration="2.930001412s" podCreationTimestamp="2025-12-12 00:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:50:23.912932244 +0000 UTC m=+1614.458285120" 
watchObservedRunningTime="2025-12-12 00:50:23.930001412 +0000 UTC m=+1614.475354288" Dec 12 00:50:31 crc kubenswrapper[4606]: I1212 00:50:31.454827 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94d468747-glhd9" Dec 12 00:50:31 crc kubenswrapper[4606]: I1212 00:50:31.530096 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-qbx5q"] Dec 12 00:50:31 crc kubenswrapper[4606]: I1212 00:50:31.530380 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" podUID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerName="dnsmasq-dns" containerID="cri-o://3181fef06a3d6ce0a21ddbdd0a4703195aeb701f665cdecc71f0073d020ab6cc" gracePeriod=10 Dec 12 00:50:31 crc kubenswrapper[4606]: I1212 00:50:31.980974 4606 generic.go:334] "Generic (PLEG): container finished" podID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerID="3181fef06a3d6ce0a21ddbdd0a4703195aeb701f665cdecc71f0073d020ab6cc" exitCode=0 Dec 12 00:50:31 crc kubenswrapper[4606]: I1212 00:50:31.981035 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" event={"ID":"c8a76785-a630-4df9-ac2e-1f2a93ecc0db","Type":"ContainerDied","Data":"3181fef06a3d6ce0a21ddbdd0a4703195aeb701f665cdecc71f0073d020ab6cc"} Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.110806 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.235580 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-sb\") pod \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.235774 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-nb\") pod \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.235825 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-swift-storage-0\") pod \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.235858 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-svc\") pod \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.235916 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvgfd\" (UniqueName: \"kubernetes.io/projected/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-kube-api-access-xvgfd\") pod \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.236014 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-config\") pod \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.236106 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-openstack-edpm-ipam\") pod \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\" (UID: \"c8a76785-a630-4df9-ac2e-1f2a93ecc0db\") " Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.244217 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-kube-api-access-xvgfd" (OuterVolumeSpecName: "kube-api-access-xvgfd") pod "c8a76785-a630-4df9-ac2e-1f2a93ecc0db" (UID: "c8a76785-a630-4df9-ac2e-1f2a93ecc0db"). InnerVolumeSpecName "kube-api-access-xvgfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.316741 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8a76785-a630-4df9-ac2e-1f2a93ecc0db" (UID: "c8a76785-a630-4df9-ac2e-1f2a93ecc0db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.330939 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-config" (OuterVolumeSpecName: "config") pod "c8a76785-a630-4df9-ac2e-1f2a93ecc0db" (UID: "c8a76785-a630-4df9-ac2e-1f2a93ecc0db"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.331405 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8a76785-a630-4df9-ac2e-1f2a93ecc0db" (UID: "c8a76785-a630-4df9-ac2e-1f2a93ecc0db"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.333240 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c8a76785-a630-4df9-ac2e-1f2a93ecc0db" (UID: "c8a76785-a630-4df9-ac2e-1f2a93ecc0db"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.335067 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c8a76785-a630-4df9-ac2e-1f2a93ecc0db" (UID: "c8a76785-a630-4df9-ac2e-1f2a93ecc0db"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.338821 4606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-config\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.338843 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.338853 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.338863 4606 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.338870 4606 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.338878 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvgfd\" (UniqueName: \"kubernetes.io/projected/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-kube-api-access-xvgfd\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.339799 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8a76785-a630-4df9-ac2e-1f2a93ecc0db" (UID: 
"c8a76785-a630-4df9-ac2e-1f2a93ecc0db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.441925 4606 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a76785-a630-4df9-ac2e-1f2a93ecc0db-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.699428 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:50:32 crc kubenswrapper[4606]: E1212 00:50:32.699801 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.993645 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" event={"ID":"c8a76785-a630-4df9-ac2e-1f2a93ecc0db","Type":"ContainerDied","Data":"1e2deaae151ed4036383a4639872cd03dd42fcb21f28794481e7fbadaede3921"} Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.993730 4606 scope.go:117] "RemoveContainer" containerID="3181fef06a3d6ce0a21ddbdd0a4703195aeb701f665cdecc71f0073d020ab6cc" Dec 12 00:50:32 crc kubenswrapper[4606]: I1212 00:50:32.993769 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-qbx5q" Dec 12 00:50:33 crc kubenswrapper[4606]: I1212 00:50:33.012968 4606 scope.go:117] "RemoveContainer" containerID="8d06268c8fb96d9c6968de678391f9c6e99c282b8c911601658fe2a57c4e3e3c" Dec 12 00:50:33 crc kubenswrapper[4606]: I1212 00:50:33.071600 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-qbx5q"] Dec 12 00:50:33 crc kubenswrapper[4606]: I1212 00:50:33.083776 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-qbx5q"] Dec 12 00:50:33 crc kubenswrapper[4606]: I1212 00:50:33.711489 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" path="/var/lib/kubelet/pods/c8a76785-a630-4df9-ac2e-1f2a93ecc0db/volumes" Dec 12 00:50:37 crc kubenswrapper[4606]: I1212 00:50:37.030861 4606 generic.go:334] "Generic (PLEG): container finished" podID="aec304bf-3003-493d-9e17-3a2f75997bdb" containerID="5bc39d8f3ddf5d9c9dd2080ad4f3f8533edb9b6550252d90280ecb869d7eb3bf" exitCode=0 Dec 12 00:50:37 crc kubenswrapper[4606]: I1212 00:50:37.030944 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aec304bf-3003-493d-9e17-3a2f75997bdb","Type":"ContainerDied","Data":"5bc39d8f3ddf5d9c9dd2080ad4f3f8533edb9b6550252d90280ecb869d7eb3bf"} Dec 12 00:50:38 crc kubenswrapper[4606]: I1212 00:50:38.046326 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"aec304bf-3003-493d-9e17-3a2f75997bdb","Type":"ContainerStarted","Data":"99a4338716c749ce381cf7dd67a885ccc46f9c1acf91d785bb69287904ab5f6c"} Dec 12 00:50:38 crc kubenswrapper[4606]: I1212 00:50:38.046810 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 12 00:50:38 crc kubenswrapper[4606]: I1212 00:50:38.092228 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-server-0" podStartSLOduration=36.092204751 podStartE2EDuration="36.092204751s" podCreationTimestamp="2025-12-12 00:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:50:38.086072967 +0000 UTC m=+1628.631425833" watchObservedRunningTime="2025-12-12 00:50:38.092204751 +0000 UTC m=+1628.637557617" Dec 12 00:50:40 crc kubenswrapper[4606]: I1212 00:50:40.064973 4606 generic.go:334] "Generic (PLEG): container finished" podID="b744992a-d383-4df5-859e-b24a8e70c1bb" containerID="0ecc2404a92d1a0f02ed8b1e89f262f7d48c12b623218d2152967c539e5b7955" exitCode=0 Dec 12 00:50:40 crc kubenswrapper[4606]: I1212 00:50:40.065067 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b744992a-d383-4df5-859e-b24a8e70c1bb","Type":"ContainerDied","Data":"0ecc2404a92d1a0f02ed8b1e89f262f7d48c12b623218d2152967c539e5b7955"} Dec 12 00:50:41 crc kubenswrapper[4606]: I1212 00:50:41.075228 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b744992a-d383-4df5-859e-b24a8e70c1bb","Type":"ContainerStarted","Data":"dbbc0a5900de5ef6bb02d5d264a586060944ee2cb61e359d3d2a1c11b87bda16"} Dec 12 00:50:41 crc kubenswrapper[4606]: I1212 00:50:41.076098 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:41 crc kubenswrapper[4606]: I1212 00:50:41.114725 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.114690033 podStartE2EDuration="37.114690033s" podCreationTimestamp="2025-12-12 00:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 00:50:41.099686252 +0000 UTC m=+1631.645039138" watchObservedRunningTime="2025-12-12 
00:50:41.114690033 +0000 UTC m=+1631.660042899" Dec 12 00:50:44 crc kubenswrapper[4606]: I1212 00:50:44.699280 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:50:44 crc kubenswrapper[4606]: E1212 00:50:44.699967 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:50:52 crc kubenswrapper[4606]: I1212 00:50:52.973459 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.829032 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29"] Dec 12 00:50:54 crc kubenswrapper[4606]: E1212 00:50:54.829809 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerName="dnsmasq-dns" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.829828 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerName="dnsmasq-dns" Dec 12 00:50:54 crc kubenswrapper[4606]: E1212 00:50:54.829845 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerName="dnsmasq-dns" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.829853 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerName="dnsmasq-dns" Dec 12 00:50:54 crc kubenswrapper[4606]: E1212 00:50:54.829890 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerName="init" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.829899 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerName="init" Dec 12 00:50:54 crc kubenswrapper[4606]: E1212 00:50:54.829916 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerName="init" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.829925 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerName="init" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.830157 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a76785-a630-4df9-ac2e-1f2a93ecc0db" containerName="dnsmasq-dns" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.830203 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8632450-2d2a-4683-a1f8-fa91a510e5bd" containerName="dnsmasq-dns" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.830960 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.834921 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.835043 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.835152 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.835265 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.854001 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29"] Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.955907 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.955957 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.956003 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv6hx\" (UniqueName: \"kubernetes.io/projected/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-kube-api-access-xv6hx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:54 crc kubenswrapper[4606]: I1212 00:50:54.956026 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.036358 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.058024 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.058070 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.058128 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.058147 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv6hx\" (UniqueName: \"kubernetes.io/projected/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-kube-api-access-xv6hx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.063500 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.072643 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.080946 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: 
\"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.081589 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv6hx\" (UniqueName: \"kubernetes.io/projected/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-kube-api-access-xv6hx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.150945 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.703628 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:50:55 crc kubenswrapper[4606]: E1212 00:50:55.704313 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:50:55 crc kubenswrapper[4606]: I1212 00:50:55.799960 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29"] Dec 12 00:50:56 crc kubenswrapper[4606]: I1212 00:50:56.235644 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" event={"ID":"ba53dca3-2038-4e50-9c9e-ebed89ee7a86","Type":"ContainerStarted","Data":"16500c25a44b97c72eb61ce3c5b447d22d04b3de297cb32b27858cb7ec6255b8"} Dec 12 00:51:00 crc 
kubenswrapper[4606]: I1212 00:51:00.997145 4606 scope.go:117] "RemoveContainer" containerID="19267573f605240774577b26e61c1f3f1d10bebac2b7012f0af98dd1b671898b" Dec 12 00:51:07 crc kubenswrapper[4606]: I1212 00:51:07.699518 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:51:07 crc kubenswrapper[4606]: E1212 00:51:07.700278 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:51:09 crc kubenswrapper[4606]: I1212 00:51:09.377130 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" event={"ID":"ba53dca3-2038-4e50-9c9e-ebed89ee7a86","Type":"ContainerStarted","Data":"1a1a69887d04e70e2e581599e7ff8d975ad1021d36f8efb07d9ef278e60bb21b"} Dec 12 00:51:09 crc kubenswrapper[4606]: I1212 00:51:09.411317 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" podStartSLOduration=2.846397381 podStartE2EDuration="15.411287029s" podCreationTimestamp="2025-12-12 00:50:54 +0000 UTC" firstStartedPulling="2025-12-12 00:50:55.800138122 +0000 UTC m=+1646.345490998" lastFinishedPulling="2025-12-12 00:51:08.36502778 +0000 UTC m=+1658.910380646" observedRunningTime="2025-12-12 00:51:09.402782841 +0000 UTC m=+1659.948135717" watchObservedRunningTime="2025-12-12 00:51:09.411287029 +0000 UTC m=+1659.956639905" Dec 12 00:51:22 crc kubenswrapper[4606]: I1212 00:51:22.496236 4606 generic.go:334] "Generic (PLEG): container finished" podID="ba53dca3-2038-4e50-9c9e-ebed89ee7a86" 
containerID="1a1a69887d04e70e2e581599e7ff8d975ad1021d36f8efb07d9ef278e60bb21b" exitCode=0 Dec 12 00:51:22 crc kubenswrapper[4606]: I1212 00:51:22.496304 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" event={"ID":"ba53dca3-2038-4e50-9c9e-ebed89ee7a86","Type":"ContainerDied","Data":"1a1a69887d04e70e2e581599e7ff8d975ad1021d36f8efb07d9ef278e60bb21b"} Dec 12 00:51:22 crc kubenswrapper[4606]: I1212 00:51:22.699205 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:51:22 crc kubenswrapper[4606]: E1212 00:51:22.699484 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:51:23 crc kubenswrapper[4606]: I1212 00:51:23.931549 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.021485 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-inventory\") pod \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.021939 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-repo-setup-combined-ca-bundle\") pod \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.021980 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv6hx\" (UniqueName: \"kubernetes.io/projected/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-kube-api-access-xv6hx\") pod \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.022640 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-ssh-key\") pod \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\" (UID: \"ba53dca3-2038-4e50-9c9e-ebed89ee7a86\") " Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.026855 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-kube-api-access-xv6hx" (OuterVolumeSpecName: "kube-api-access-xv6hx") pod "ba53dca3-2038-4e50-9c9e-ebed89ee7a86" (UID: "ba53dca3-2038-4e50-9c9e-ebed89ee7a86"). InnerVolumeSpecName "kube-api-access-xv6hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.028321 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ba53dca3-2038-4e50-9c9e-ebed89ee7a86" (UID: "ba53dca3-2038-4e50-9c9e-ebed89ee7a86"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.048057 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ba53dca3-2038-4e50-9c9e-ebed89ee7a86" (UID: "ba53dca3-2038-4e50-9c9e-ebed89ee7a86"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.059393 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-inventory" (OuterVolumeSpecName: "inventory") pod "ba53dca3-2038-4e50-9c9e-ebed89ee7a86" (UID: "ba53dca3-2038-4e50-9c9e-ebed89ee7a86"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.125377 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.125540 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.125612 4606 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.125690 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv6hx\" (UniqueName: \"kubernetes.io/projected/ba53dca3-2038-4e50-9c9e-ebed89ee7a86-kube-api-access-xv6hx\") on node \"crc\" DevicePath \"\"" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.515722 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" event={"ID":"ba53dca3-2038-4e50-9c9e-ebed89ee7a86","Type":"ContainerDied","Data":"16500c25a44b97c72eb61ce3c5b447d22d04b3de297cb32b27858cb7ec6255b8"} Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.515760 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16500c25a44b97c72eb61ce3c5b447d22d04b3de297cb32b27858cb7ec6255b8" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.515790 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.601791 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc"] Dec 12 00:51:24 crc kubenswrapper[4606]: E1212 00:51:24.602236 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba53dca3-2038-4e50-9c9e-ebed89ee7a86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.602256 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba53dca3-2038-4e50-9c9e-ebed89ee7a86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.602498 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba53dca3-2038-4e50-9c9e-ebed89ee7a86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.603173 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.607658 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.607668 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.607868 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.613003 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.615922 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc"] Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.736298 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrtrr\" (UniqueName: \"kubernetes.io/projected/0118f359-bb8d-4b8a-be2b-0437ed43b303-kube-api-access-lrtrr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.736483 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.736713 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.837911 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrtrr\" (UniqueName: \"kubernetes.io/projected/0118f359-bb8d-4b8a-be2b-0437ed43b303-kube-api-access-lrtrr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.838049 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.838129 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.843433 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.849108 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.854975 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrtrr\" (UniqueName: \"kubernetes.io/projected/0118f359-bb8d-4b8a-be2b-0437ed43b303-kube-api-access-lrtrr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqmmc\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:24 crc kubenswrapper[4606]: I1212 00:51:24.921390 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:25 crc kubenswrapper[4606]: I1212 00:51:25.431568 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc"] Dec 12 00:51:25 crc kubenswrapper[4606]: I1212 00:51:25.524872 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" event={"ID":"0118f359-bb8d-4b8a-be2b-0437ed43b303","Type":"ContainerStarted","Data":"cf6f964d4759d522ee46c09c10ec38b8f58dc4955dc63aeb78c1f547ac334aa8"} Dec 12 00:51:26 crc kubenswrapper[4606]: I1212 00:51:26.551152 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" event={"ID":"0118f359-bb8d-4b8a-be2b-0437ed43b303","Type":"ContainerStarted","Data":"5dce1ed870de675bffb73793ea962691ddd2dbe5cd231adcf3a5270b724bed55"} Dec 12 00:51:28 crc kubenswrapper[4606]: I1212 00:51:28.582947 4606 generic.go:334] "Generic (PLEG): container finished" podID="0118f359-bb8d-4b8a-be2b-0437ed43b303" containerID="5dce1ed870de675bffb73793ea962691ddd2dbe5cd231adcf3a5270b724bed55" exitCode=0 Dec 12 00:51:28 crc kubenswrapper[4606]: I1212 00:51:28.583003 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" event={"ID":"0118f359-bb8d-4b8a-be2b-0437ed43b303","Type":"ContainerDied","Data":"5dce1ed870de675bffb73793ea962691ddd2dbe5cd231adcf3a5270b724bed55"} Dec 12 00:51:29 crc kubenswrapper[4606]: I1212 00:51:29.961275 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.038035 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrtrr\" (UniqueName: \"kubernetes.io/projected/0118f359-bb8d-4b8a-be2b-0437ed43b303-kube-api-access-lrtrr\") pod \"0118f359-bb8d-4b8a-be2b-0437ed43b303\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.038351 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-ssh-key\") pod \"0118f359-bb8d-4b8a-be2b-0437ed43b303\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.038489 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-inventory\") pod \"0118f359-bb8d-4b8a-be2b-0437ed43b303\" (UID: \"0118f359-bb8d-4b8a-be2b-0437ed43b303\") " Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.047009 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0118f359-bb8d-4b8a-be2b-0437ed43b303-kube-api-access-lrtrr" (OuterVolumeSpecName: "kube-api-access-lrtrr") pod "0118f359-bb8d-4b8a-be2b-0437ed43b303" (UID: "0118f359-bb8d-4b8a-be2b-0437ed43b303"). InnerVolumeSpecName "kube-api-access-lrtrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.070657 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-inventory" (OuterVolumeSpecName: "inventory") pod "0118f359-bb8d-4b8a-be2b-0437ed43b303" (UID: "0118f359-bb8d-4b8a-be2b-0437ed43b303"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.071101 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0118f359-bb8d-4b8a-be2b-0437ed43b303" (UID: "0118f359-bb8d-4b8a-be2b-0437ed43b303"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.141145 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.141194 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0118f359-bb8d-4b8a-be2b-0437ed43b303-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.141205 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrtrr\" (UniqueName: \"kubernetes.io/projected/0118f359-bb8d-4b8a-be2b-0437ed43b303-kube-api-access-lrtrr\") on node \"crc\" DevicePath \"\"" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.605108 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.604998 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqmmc" event={"ID":"0118f359-bb8d-4b8a-be2b-0437ed43b303","Type":"ContainerDied","Data":"cf6f964d4759d522ee46c09c10ec38b8f58dc4955dc63aeb78c1f547ac334aa8"} Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.619066 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf6f964d4759d522ee46c09c10ec38b8f58dc4955dc63aeb78c1f547ac334aa8" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.698286 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c"] Dec 12 00:51:30 crc kubenswrapper[4606]: E1212 00:51:30.698799 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0118f359-bb8d-4b8a-be2b-0437ed43b303" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.698818 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0118f359-bb8d-4b8a-be2b-0437ed43b303" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.699022 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0118f359-bb8d-4b8a-be2b-0437ed43b303" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.699933 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.703691 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.703997 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.704121 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.704275 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.712270 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c"] Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.853198 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.853272 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.853308 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.854090 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9xbk\" (UniqueName: \"kubernetes.io/projected/cd9c36e5-43c7-4723-b818-e8b4129d578a-kube-api-access-d9xbk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.955684 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9xbk\" (UniqueName: \"kubernetes.io/projected/cd9c36e5-43c7-4723-b818-e8b4129d578a-kube-api-access-d9xbk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.955799 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.955860 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.955906 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.962059 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.962494 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.968580 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:30 crc kubenswrapper[4606]: I1212 00:51:30.975060 4606 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-d9xbk\" (UniqueName: \"kubernetes.io/projected/cd9c36e5-43c7-4723-b818-e8b4129d578a-kube-api-access-d9xbk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:31 crc kubenswrapper[4606]: I1212 00:51:31.022199 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:51:31 crc kubenswrapper[4606]: I1212 00:51:31.564291 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c"] Dec 12 00:51:31 crc kubenswrapper[4606]: I1212 00:51:31.615389 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" event={"ID":"cd9c36e5-43c7-4723-b818-e8b4129d578a","Type":"ContainerStarted","Data":"2e74ff71f206ec4558dacad296723dafa492df2d151fa550515508f72458ad9c"} Dec 12 00:51:32 crc kubenswrapper[4606]: I1212 00:51:32.625234 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" event={"ID":"cd9c36e5-43c7-4723-b818-e8b4129d578a","Type":"ContainerStarted","Data":"4c69680dfdea70200baeaa31a69b48738f2b8b09b7491d2301583c216e1d970f"} Dec 12 00:51:33 crc kubenswrapper[4606]: I1212 00:51:33.700375 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:51:33 crc kubenswrapper[4606]: E1212 00:51:33.700780 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:51:46 crc kubenswrapper[4606]: I1212 00:51:46.699512 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:51:46 crc kubenswrapper[4606]: E1212 00:51:46.700331 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:52:00 crc kubenswrapper[4606]: I1212 00:52:00.701154 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:52:00 crc kubenswrapper[4606]: E1212 00:52:00.702106 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:52:01 crc kubenswrapper[4606]: I1212 00:52:01.140219 4606 scope.go:117] "RemoveContainer" containerID="a4823e5b2d7c2714afe1b56e652606df93932f01689b012beee3a434a8505f96" Dec 12 00:52:14 crc kubenswrapper[4606]: I1212 00:52:14.700194 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:52:14 crc kubenswrapper[4606]: E1212 00:52:14.700975 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:52:27 crc kubenswrapper[4606]: I1212 00:52:27.700265 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:52:27 crc kubenswrapper[4606]: E1212 00:52:27.701602 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:52:38 crc kubenswrapper[4606]: I1212 00:52:38.699911 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:52:38 crc kubenswrapper[4606]: E1212 00:52:38.701152 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:52:53 crc kubenswrapper[4606]: I1212 00:52:53.700992 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:52:53 crc kubenswrapper[4606]: E1212 00:52:53.702790 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:53:05 crc kubenswrapper[4606]: I1212 00:53:05.700385 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:53:06 crc kubenswrapper[4606]: I1212 00:53:06.616805 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"99029e9498baa7a74ac4278469a21ec659c962bde39e91cbead95d5b96de00b5"} Dec 12 00:53:06 crc kubenswrapper[4606]: I1212 00:53:06.634997 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" podStartSLOduration=96.460861524 podStartE2EDuration="1m36.634963998s" podCreationTimestamp="2025-12-12 00:51:30 +0000 UTC" firstStartedPulling="2025-12-12 00:51:31.569895291 +0000 UTC m=+1682.115248157" lastFinishedPulling="2025-12-12 00:51:31.743997725 +0000 UTC m=+1682.289350631" observedRunningTime="2025-12-12 00:51:32.644780287 +0000 UTC m=+1683.190133173" watchObservedRunningTime="2025-12-12 00:53:06.634963998 +0000 UTC m=+1777.180316864" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.001746 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pb274"] Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.004096 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.019107 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb274"] Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.058987 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lpv\" (UniqueName: \"kubernetes.io/projected/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-kube-api-access-w7lpv\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.059094 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-utilities\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.059129 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-catalog-content\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.160604 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lpv\" (UniqueName: \"kubernetes.io/projected/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-kube-api-access-w7lpv\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.160694 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-utilities\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.160720 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-catalog-content\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.161198 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-catalog-content\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.161283 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-utilities\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.183054 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lpv\" (UniqueName: \"kubernetes.io/projected/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-kube-api-access-w7lpv\") pod \"community-operators-pb274\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.348854 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:28 crc kubenswrapper[4606]: I1212 00:54:28.988845 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb274"] Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.012245 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpfv8"] Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.015156 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.040285 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpfv8"] Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.178611 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-utilities\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.178900 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsz2m\" (UniqueName: \"kubernetes.io/projected/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-kube-api-access-xsz2m\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.179032 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-catalog-content\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " 
pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.280935 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-utilities\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.281025 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsz2m\" (UniqueName: \"kubernetes.io/projected/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-kube-api-access-xsz2m\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.281056 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-catalog-content\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.281466 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-utilities\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.281500 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-catalog-content\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " 
pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.305383 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsz2m\" (UniqueName: \"kubernetes.io/projected/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-kube-api-access-xsz2m\") pod \"certified-operators-hpfv8\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.414214 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.466396 4606 generic.go:334] "Generic (PLEG): container finished" podID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerID="2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b" exitCode=0 Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.466455 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb274" event={"ID":"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14","Type":"ContainerDied","Data":"2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b"} Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.466485 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb274" event={"ID":"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14","Type":"ContainerStarted","Data":"9c07a5e57d41a6e8928927855cd993a3719976d013c0c6ecd05b804573a7f8c1"} Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.469061 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 00:54:29 crc kubenswrapper[4606]: I1212 00:54:29.905825 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpfv8"] Dec 12 00:54:30 crc kubenswrapper[4606]: I1212 00:54:30.475751 4606 generic.go:334] "Generic (PLEG): container 
finished" podID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerID="358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935" exitCode=0 Dec 12 00:54:30 crc kubenswrapper[4606]: I1212 00:54:30.475853 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpfv8" event={"ID":"1086d3e2-5d50-4b16-9f8a-84d41323dd7d","Type":"ContainerDied","Data":"358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935"} Dec 12 00:54:30 crc kubenswrapper[4606]: I1212 00:54:30.475985 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpfv8" event={"ID":"1086d3e2-5d50-4b16-9f8a-84d41323dd7d","Type":"ContainerStarted","Data":"8c2340d6218996b21fac57c9b2510b1a6e87b9834c7ad941272324f681c8cd36"} Dec 12 00:54:31 crc kubenswrapper[4606]: I1212 00:54:31.497333 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb274" event={"ID":"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14","Type":"ContainerStarted","Data":"7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309"} Dec 12 00:54:32 crc kubenswrapper[4606]: I1212 00:54:32.508321 4606 generic.go:334] "Generic (PLEG): container finished" podID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerID="7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309" exitCode=0 Dec 12 00:54:32 crc kubenswrapper[4606]: I1212 00:54:32.508415 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb274" event={"ID":"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14","Type":"ContainerDied","Data":"7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309"} Dec 12 00:54:32 crc kubenswrapper[4606]: I1212 00:54:32.511519 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpfv8" 
event={"ID":"1086d3e2-5d50-4b16-9f8a-84d41323dd7d","Type":"ContainerStarted","Data":"4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee"} Dec 12 00:54:34 crc kubenswrapper[4606]: I1212 00:54:34.540911 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb274" event={"ID":"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14","Type":"ContainerStarted","Data":"2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91"} Dec 12 00:54:34 crc kubenswrapper[4606]: I1212 00:54:34.545709 4606 generic.go:334] "Generic (PLEG): container finished" podID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerID="4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee" exitCode=0 Dec 12 00:54:34 crc kubenswrapper[4606]: I1212 00:54:34.545758 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpfv8" event={"ID":"1086d3e2-5d50-4b16-9f8a-84d41323dd7d","Type":"ContainerDied","Data":"4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee"} Dec 12 00:54:34 crc kubenswrapper[4606]: I1212 00:54:34.564662 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pb274" podStartSLOduration=2.900817756 podStartE2EDuration="7.564630906s" podCreationTimestamp="2025-12-12 00:54:27 +0000 UTC" firstStartedPulling="2025-12-12 00:54:29.468710018 +0000 UTC m=+1860.014062904" lastFinishedPulling="2025-12-12 00:54:34.132523188 +0000 UTC m=+1864.677876054" observedRunningTime="2025-12-12 00:54:34.564439431 +0000 UTC m=+1865.109792297" watchObservedRunningTime="2025-12-12 00:54:34.564630906 +0000 UTC m=+1865.109983782" Dec 12 00:54:35 crc kubenswrapper[4606]: I1212 00:54:35.555837 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpfv8" 
event={"ID":"1086d3e2-5d50-4b16-9f8a-84d41323dd7d","Type":"ContainerStarted","Data":"62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe"} Dec 12 00:54:35 crc kubenswrapper[4606]: I1212 00:54:35.584505 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpfv8" podStartSLOduration=3.084349749 podStartE2EDuration="7.584488258s" podCreationTimestamp="2025-12-12 00:54:28 +0000 UTC" firstStartedPulling="2025-12-12 00:54:30.477241916 +0000 UTC m=+1861.022594782" lastFinishedPulling="2025-12-12 00:54:34.977380425 +0000 UTC m=+1865.522733291" observedRunningTime="2025-12-12 00:54:35.579964266 +0000 UTC m=+1866.125317132" watchObservedRunningTime="2025-12-12 00:54:35.584488258 +0000 UTC m=+1866.129841114" Dec 12 00:54:38 crc kubenswrapper[4606]: I1212 00:54:38.349720 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:38 crc kubenswrapper[4606]: I1212 00:54:38.350402 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.063062 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nm7sl"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.076003 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-z942v"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.087742 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6e57-account-create-update-bstwq"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.098674 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rpf5j"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.108048 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4df9-account-create-update-7t79l"] 
Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.116352 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nm7sl"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.124126 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-z942v"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.131760 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6e57-account-create-update-bstwq"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.141239 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4df9-account-create-update-7t79l"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.148758 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rpf5j"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.156879 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ee90-account-create-update-2wb28"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.164829 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ee90-account-create-update-2wb28"] Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.415010 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.418031 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.425043 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pb274" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="registry-server" probeResult="failure" output=< Dec 12 00:54:39 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:54:39 crc kubenswrapper[4606]: > Dec 12 00:54:39 
crc kubenswrapper[4606]: I1212 00:54:39.732544 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c5f767-2214-4103-94ef-c9b98cfb9269" path="/var/lib/kubelet/pods/38c5f767-2214-4103-94ef-c9b98cfb9269/volumes" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.733869 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c857da8-746f-4a51-b509-e6ed45614ab6" path="/var/lib/kubelet/pods/6c857da8-746f-4a51-b509-e6ed45614ab6/volumes" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.735276 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92703eda-3f9e-40c1-9eef-c637ccfe0552" path="/var/lib/kubelet/pods/92703eda-3f9e-40c1-9eef-c637ccfe0552/volumes" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.738109 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2358288-ebbf-4430-9684-7bcdf01349a4" path="/var/lib/kubelet/pods/b2358288-ebbf-4430-9684-7bcdf01349a4/volumes" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.739288 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34d547f-c6ae-48f9-8df6-1d2d35942f23" path="/var/lib/kubelet/pods/d34d547f-c6ae-48f9-8df6-1d2d35942f23/volumes" Dec 12 00:54:39 crc kubenswrapper[4606]: I1212 00:54:39.740791 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5" path="/var/lib/kubelet/pods/f8b5bcea-0e2f-4d42-b6bd-0f7ea87ed0e5/volumes" Dec 12 00:54:40 crc kubenswrapper[4606]: I1212 00:54:40.496916 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hpfv8" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="registry-server" probeResult="failure" output=< Dec 12 00:54:40 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:54:40 crc kubenswrapper[4606]: > Dec 12 00:54:48 crc kubenswrapper[4606]: I1212 00:54:48.424105 4606 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:48 crc kubenswrapper[4606]: I1212 00:54:48.496717 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:48 crc kubenswrapper[4606]: I1212 00:54:48.664141 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pb274"] Dec 12 00:54:49 crc kubenswrapper[4606]: I1212 00:54:49.484696 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:49 crc kubenswrapper[4606]: I1212 00:54:49.529506 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:49 crc kubenswrapper[4606]: I1212 00:54:49.682668 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pb274" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="registry-server" containerID="cri-o://2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91" gracePeriod=2 Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.105131 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.197262 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-utilities\") pod \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.197336 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-catalog-content\") pod \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.197453 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7lpv\" (UniqueName: \"kubernetes.io/projected/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-kube-api-access-w7lpv\") pod \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\" (UID: \"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14\") " Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.197749 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-utilities" (OuterVolumeSpecName: "utilities") pod "71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" (UID: "71acbb65-a2ae-48dc-9b25-8e8c3eb02d14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.198135 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.206439 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-kube-api-access-w7lpv" (OuterVolumeSpecName: "kube-api-access-w7lpv") pod "71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" (UID: "71acbb65-a2ae-48dc-9b25-8e8c3eb02d14"). InnerVolumeSpecName "kube-api-access-w7lpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.260339 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" (UID: "71acbb65-a2ae-48dc-9b25-8e8c3eb02d14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.299599 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.299656 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7lpv\" (UniqueName: \"kubernetes.io/projected/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14-kube-api-access-w7lpv\") on node \"crc\" DevicePath \"\"" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.698501 4606 generic.go:334] "Generic (PLEG): container finished" podID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerID="2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91" exitCode=0 Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.698745 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb274" event={"ID":"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14","Type":"ContainerDied","Data":"2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91"} Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.698771 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb274" event={"ID":"71acbb65-a2ae-48dc-9b25-8e8c3eb02d14","Type":"ContainerDied","Data":"9c07a5e57d41a6e8928927855cd993a3719976d013c0c6ecd05b804573a7f8c1"} Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.698788 4606 scope.go:117] "RemoveContainer" containerID="2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.698913 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pb274" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.733682 4606 scope.go:117] "RemoveContainer" containerID="7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.754380 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pb274"] Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.761662 4606 scope.go:117] "RemoveContainer" containerID="2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.763489 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pb274"] Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.799537 4606 scope.go:117] "RemoveContainer" containerID="2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91" Dec 12 00:54:50 crc kubenswrapper[4606]: E1212 00:54:50.800014 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91\": container with ID starting with 2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91 not found: ID does not exist" containerID="2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.800062 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91"} err="failed to get container status \"2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91\": rpc error: code = NotFound desc = could not find container \"2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91\": container with ID starting with 2ad9e01e12bd63d0223303b37441e31c0257b705e690c627d56adc80fa827a91 not 
found: ID does not exist" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.800095 4606 scope.go:117] "RemoveContainer" containerID="7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309" Dec 12 00:54:50 crc kubenswrapper[4606]: E1212 00:54:50.800414 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309\": container with ID starting with 7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309 not found: ID does not exist" containerID="7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.800440 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309"} err="failed to get container status \"7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309\": rpc error: code = NotFound desc = could not find container \"7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309\": container with ID starting with 7be936345d71c2f4c558d8a717cc513658e2025a812a4b79207db51a896d1309 not found: ID does not exist" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.800455 4606 scope.go:117] "RemoveContainer" containerID="2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b" Dec 12 00:54:50 crc kubenswrapper[4606]: E1212 00:54:50.800677 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b\": container with ID starting with 2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b not found: ID does not exist" containerID="2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b" Dec 12 00:54:50 crc kubenswrapper[4606]: I1212 00:54:50.800708 4606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b"} err="failed to get container status \"2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b\": rpc error: code = NotFound desc = could not find container \"2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b\": container with ID starting with 2654a8c7fe19afb9880831ddda941101db6b9d583ca6fe3779ff2e38ef92c41b not found: ID does not exist" Dec 12 00:54:51 crc kubenswrapper[4606]: I1212 00:54:51.710239 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" path="/var/lib/kubelet/pods/71acbb65-a2ae-48dc-9b25-8e8c3eb02d14/volumes" Dec 12 00:54:51 crc kubenswrapper[4606]: I1212 00:54:51.868653 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpfv8"] Dec 12 00:54:51 crc kubenswrapper[4606]: I1212 00:54:51.869189 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpfv8" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="registry-server" containerID="cri-o://62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe" gracePeriod=2 Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.338157 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.454483 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-catalog-content\") pod \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.454967 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsz2m\" (UniqueName: \"kubernetes.io/projected/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-kube-api-access-xsz2m\") pod \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.455209 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-utilities\") pod \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\" (UID: \"1086d3e2-5d50-4b16-9f8a-84d41323dd7d\") " Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.456128 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-utilities" (OuterVolumeSpecName: "utilities") pod "1086d3e2-5d50-4b16-9f8a-84d41323dd7d" (UID: "1086d3e2-5d50-4b16-9f8a-84d41323dd7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.470268 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-kube-api-access-xsz2m" (OuterVolumeSpecName: "kube-api-access-xsz2m") pod "1086d3e2-5d50-4b16-9f8a-84d41323dd7d" (UID: "1086d3e2-5d50-4b16-9f8a-84d41323dd7d"). InnerVolumeSpecName "kube-api-access-xsz2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.510575 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1086d3e2-5d50-4b16-9f8a-84d41323dd7d" (UID: "1086d3e2-5d50-4b16-9f8a-84d41323dd7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.563344 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsz2m\" (UniqueName: \"kubernetes.io/projected/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-kube-api-access-xsz2m\") on node \"crc\" DevicePath \"\"" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.563418 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.563432 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1086d3e2-5d50-4b16-9f8a-84d41323dd7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.720740 4606 generic.go:334] "Generic (PLEG): container finished" podID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerID="62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe" exitCode=0 Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.720809 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpfv8" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.720802 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpfv8" event={"ID":"1086d3e2-5d50-4b16-9f8a-84d41323dd7d","Type":"ContainerDied","Data":"62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe"} Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.721989 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpfv8" event={"ID":"1086d3e2-5d50-4b16-9f8a-84d41323dd7d","Type":"ContainerDied","Data":"8c2340d6218996b21fac57c9b2510b1a6e87b9834c7ad941272324f681c8cd36"} Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.722051 4606 scope.go:117] "RemoveContainer" containerID="62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.765331 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpfv8"] Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.777093 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpfv8"] Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.778257 4606 scope.go:117] "RemoveContainer" containerID="4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.808058 4606 scope.go:117] "RemoveContainer" containerID="358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.861055 4606 scope.go:117] "RemoveContainer" containerID="62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe" Dec 12 00:54:52 crc kubenswrapper[4606]: E1212 00:54:52.861756 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe\": container with ID starting with 62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe not found: ID does not exist" containerID="62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.861806 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe"} err="failed to get container status \"62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe\": rpc error: code = NotFound desc = could not find container \"62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe\": container with ID starting with 62dd53669f72020d8f009e0c68986f361a470cfaf458a4af33d2aef8c7054bbe not found: ID does not exist" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.861840 4606 scope.go:117] "RemoveContainer" containerID="4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee" Dec 12 00:54:52 crc kubenswrapper[4606]: E1212 00:54:52.862410 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee\": container with ID starting with 4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee not found: ID does not exist" containerID="4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.862447 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee"} err="failed to get container status \"4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee\": rpc error: code = NotFound desc = could not find container \"4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee\": container with ID 
starting with 4fef64e7bc1febb85db967e4ffb45d893e25372c2c9d63f9cb7d00b2b2962cee not found: ID does not exist" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.862475 4606 scope.go:117] "RemoveContainer" containerID="358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935" Dec 12 00:54:52 crc kubenswrapper[4606]: E1212 00:54:52.862762 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935\": container with ID starting with 358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935 not found: ID does not exist" containerID="358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935" Dec 12 00:54:52 crc kubenswrapper[4606]: I1212 00:54:52.862792 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935"} err="failed to get container status \"358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935\": rpc error: code = NotFound desc = could not find container \"358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935\": container with ID starting with 358d96df0c5dc796c96f9a8da2a75271df9311dd22dda47d54eb1e02d4699935 not found: ID does not exist" Dec 12 00:54:53 crc kubenswrapper[4606]: I1212 00:54:53.714618 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" path="/var/lib/kubelet/pods/1086d3e2-5d50-4b16-9f8a-84d41323dd7d/volumes" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.275100 4606 scope.go:117] "RemoveContainer" containerID="dd2e2914261d84eecc80e972bb81081ea3349b7dddbb356f4faf14ffc5996a2a" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.300984 4606 scope.go:117] "RemoveContainer" containerID="bfd693e38c2149d439bc9ac37a35c49a4535ee289f44be5a6267d84afb08398c" Dec 12 00:55:01 crc kubenswrapper[4606]: 
I1212 00:55:01.347518 4606 scope.go:117] "RemoveContainer" containerID="40f018ad1c87a265e10bb7fc511038d32e836b545fc9ae443dbd1830c25d4216" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.407156 4606 scope.go:117] "RemoveContainer" containerID="23741727f4506a2f3bd377fd00f0540d2072a00cf6e8864f626bd9f977381f4c" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.429036 4606 scope.go:117] "RemoveContainer" containerID="f25dad178361a4e06a05c81b514a8a431465f4ea2fe6bba51af3760c62b8d500" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.462733 4606 scope.go:117] "RemoveContainer" containerID="ca55a0c69b9e5148ed7e570d1e1cff32cf0739821995257ca5662d20ac134add" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.497150 4606 scope.go:117] "RemoveContainer" containerID="f453eb4163c594ea7b3a74aba61b9ff9e7cc35070add842376817c36698caa93" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.542099 4606 scope.go:117] "RemoveContainer" containerID="54ddf391554cecd54f714f13fac2eb26dc6050ba5beb3441de205298d6d8f284" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.573676 4606 scope.go:117] "RemoveContainer" containerID="86a4d1893867a6d26ce4dc36c713b48bd9b0c1ea6454a44918a9a72cb29a5282" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.604577 4606 scope.go:117] "RemoveContainer" containerID="e3707bda609b96a4e83c0b905bc76c57410e4ae657ff1a5cf39a5ac924f18912" Dec 12 00:55:01 crc kubenswrapper[4606]: I1212 00:55:01.624705 4606 scope.go:117] "RemoveContainer" containerID="a3b6a1da3cc761d686854df968b045d2f1625ef9d6816a1aae23f451a66e1453" Dec 12 00:55:02 crc kubenswrapper[4606]: I1212 00:55:02.809075 4606 generic.go:334] "Generic (PLEG): container finished" podID="cd9c36e5-43c7-4723-b818-e8b4129d578a" containerID="4c69680dfdea70200baeaa31a69b48738f2b8b09b7491d2301583c216e1d970f" exitCode=0 Dec 12 00:55:02 crc kubenswrapper[4606]: I1212 00:55:02.809114 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" event={"ID":"cd9c36e5-43c7-4723-b818-e8b4129d578a","Type":"ContainerDied","Data":"4c69680dfdea70200baeaa31a69b48738f2b8b09b7491d2301583c216e1d970f"} Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.311030 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.479932 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-ssh-key\") pod \"cd9c36e5-43c7-4723-b818-e8b4129d578a\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.480936 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9xbk\" (UniqueName: \"kubernetes.io/projected/cd9c36e5-43c7-4723-b818-e8b4129d578a-kube-api-access-d9xbk\") pod \"cd9c36e5-43c7-4723-b818-e8b4129d578a\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.481053 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-bootstrap-combined-ca-bundle\") pod \"cd9c36e5-43c7-4723-b818-e8b4129d578a\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.481127 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-inventory\") pod \"cd9c36e5-43c7-4723-b818-e8b4129d578a\" (UID: \"cd9c36e5-43c7-4723-b818-e8b4129d578a\") " Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.487027 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cd9c36e5-43c7-4723-b818-e8b4129d578a" (UID: "cd9c36e5-43c7-4723-b818-e8b4129d578a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.491052 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd9c36e5-43c7-4723-b818-e8b4129d578a-kube-api-access-d9xbk" (OuterVolumeSpecName: "kube-api-access-d9xbk") pod "cd9c36e5-43c7-4723-b818-e8b4129d578a" (UID: "cd9c36e5-43c7-4723-b818-e8b4129d578a"). InnerVolumeSpecName "kube-api-access-d9xbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.509232 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cd9c36e5-43c7-4723-b818-e8b4129d578a" (UID: "cd9c36e5-43c7-4723-b818-e8b4129d578a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.520012 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-inventory" (OuterVolumeSpecName: "inventory") pod "cd9c36e5-43c7-4723-b818-e8b4129d578a" (UID: "cd9c36e5-43c7-4723-b818-e8b4129d578a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.585249 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.585315 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9xbk\" (UniqueName: \"kubernetes.io/projected/cd9c36e5-43c7-4723-b818-e8b4129d578a-kube-api-access-d9xbk\") on node \"crc\" DevicePath \"\"" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.585339 4606 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.585359 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd9c36e5-43c7-4723-b818-e8b4129d578a-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.836086 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" event={"ID":"cd9c36e5-43c7-4723-b818-e8b4129d578a","Type":"ContainerDied","Data":"2e74ff71f206ec4558dacad296723dafa492df2d151fa550515508f72458ad9c"} Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.836122 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e74ff71f206ec4558dacad296723dafa492df2d151fa550515508f72458ad9c" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.836129 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967350 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5"] Dec 12 00:55:04 crc kubenswrapper[4606]: E1212 00:55:04.967705 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="registry-server" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967720 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="registry-server" Dec 12 00:55:04 crc kubenswrapper[4606]: E1212 00:55:04.967738 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9c36e5-43c7-4723-b818-e8b4129d578a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967745 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9c36e5-43c7-4723-b818-e8b4129d578a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 12 00:55:04 crc kubenswrapper[4606]: E1212 00:55:04.967767 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="extract-content" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967773 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="extract-content" Dec 12 00:55:04 crc kubenswrapper[4606]: E1212 00:55:04.967780 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="registry-server" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967787 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="registry-server" Dec 12 00:55:04 crc kubenswrapper[4606]: E1212 00:55:04.967801 4606 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="extract-utilities" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967809 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="extract-utilities" Dec 12 00:55:04 crc kubenswrapper[4606]: E1212 00:55:04.967818 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="extract-utilities" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967824 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="extract-utilities" Dec 12 00:55:04 crc kubenswrapper[4606]: E1212 00:55:04.967835 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="extract-content" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.967843 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="extract-content" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.968036 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="71acbb65-a2ae-48dc-9b25-8e8c3eb02d14" containerName="registry-server" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.968056 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9c36e5-43c7-4723-b818-e8b4129d578a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.968068 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="1086d3e2-5d50-4b16-9f8a-84d41323dd7d" containerName="registry-server" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.968677 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.975062 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.975252 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.977909 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:55:04 crc kubenswrapper[4606]: I1212 00:55:04.989372 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.012836 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5"] Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.094277 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.094450 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.094488 4606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvjmx\" (UniqueName: \"kubernetes.io/projected/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-kube-api-access-mvjmx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.196060 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.196429 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.196514 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvjmx\" (UniqueName: \"kubernetes.io/projected/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-kube-api-access-mvjmx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.199765 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.216684 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.220093 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvjmx\" (UniqueName: \"kubernetes.io/projected/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-kube-api-access-mvjmx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.287929 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:55:05 crc kubenswrapper[4606]: I1212 00:55:05.882826 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5"] Dec 12 00:55:05 crc kubenswrapper[4606]: W1212 00:55:05.889662 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d1c2004_cdcb_4729_a4f2_43a08adf9c04.slice/crio-912c5fad06924179b1a1d225d50620afe4c9ccecd886fa02cbe9450e741a1f55 WatchSource:0}: Error finding container 912c5fad06924179b1a1d225d50620afe4c9ccecd886fa02cbe9450e741a1f55: Status 404 returned error can't find the container with id 912c5fad06924179b1a1d225d50620afe4c9ccecd886fa02cbe9450e741a1f55 Dec 12 00:55:06 crc kubenswrapper[4606]: I1212 00:55:06.852951 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" event={"ID":"0d1c2004-cdcb-4729-a4f2-43a08adf9c04","Type":"ContainerStarted","Data":"3821f6ab823d4a08913b2037a3798473e2de5adfe64e889fb5c6b00d7d0a6653"} Dec 12 00:55:06 crc kubenswrapper[4606]: I1212 00:55:06.853663 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" event={"ID":"0d1c2004-cdcb-4729-a4f2-43a08adf9c04","Type":"ContainerStarted","Data":"912c5fad06924179b1a1d225d50620afe4c9ccecd886fa02cbe9450e741a1f55"} Dec 12 00:55:06 crc kubenswrapper[4606]: I1212 00:55:06.884043 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" podStartSLOduration=2.671608691 podStartE2EDuration="2.884021166s" podCreationTimestamp="2025-12-12 00:55:04 +0000 UTC" firstStartedPulling="2025-12-12 00:55:05.891790645 +0000 UTC m=+1896.437143521" lastFinishedPulling="2025-12-12 00:55:06.10420312 +0000 UTC 
m=+1896.649555996" observedRunningTime="2025-12-12 00:55:06.873692059 +0000 UTC m=+1897.419044925" watchObservedRunningTime="2025-12-12 00:55:06.884021166 +0000 UTC m=+1897.429374032" Dec 12 00:55:09 crc kubenswrapper[4606]: I1212 00:55:09.049593 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hhlpv"] Dec 12 00:55:09 crc kubenswrapper[4606]: I1212 00:55:09.058469 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hhlpv"] Dec 12 00:55:09 crc kubenswrapper[4606]: I1212 00:55:09.711892 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db5468c-db5e-4cbb-a854-d3e805d9744e" path="/var/lib/kubelet/pods/9db5468c-db5e-4cbb-a854-d3e805d9744e/volumes" Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.035778 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5080-account-create-update-ztlj6"] Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.043726 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-019c-account-create-update-h2brj"] Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.054271 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-nrs7j"] Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.062374 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-019c-account-create-update-h2brj"] Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.070549 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5080-account-create-update-ztlj6"] Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.078113 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-nrs7j"] Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.719061 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa" 
path="/var/lib/kubelet/pods/5d88d3ae-b52d-4a92-9bdb-ba5d5541abaa/volumes" Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.722307 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f2dac5-52f1-4438-aa85-059425ed7822" path="/var/lib/kubelet/pods/92f2dac5-52f1-4438-aa85-059425ed7822/volumes" Dec 12 00:55:13 crc kubenswrapper[4606]: I1212 00:55:13.724728 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be20f016-2639-4715-80af-5719a068a857" path="/var/lib/kubelet/pods/be20f016-2639-4715-80af-5719a068a857/volumes" Dec 12 00:55:14 crc kubenswrapper[4606]: I1212 00:55:14.036886 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d1d-account-create-update-r2rq9"] Dec 12 00:55:14 crc kubenswrapper[4606]: I1212 00:55:14.049689 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8mpd7"] Dec 12 00:55:14 crc kubenswrapper[4606]: I1212 00:55:14.065611 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4rtsn"] Dec 12 00:55:14 crc kubenswrapper[4606]: I1212 00:55:14.073690 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4rtsn"] Dec 12 00:55:14 crc kubenswrapper[4606]: I1212 00:55:14.081678 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8mpd7"] Dec 12 00:55:14 crc kubenswrapper[4606]: I1212 00:55:14.090047 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d1d-account-create-update-r2rq9"] Dec 12 00:55:15 crc kubenswrapper[4606]: I1212 00:55:15.716987 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea658e5-7318-49c3-ab43-00c885902e4f" path="/var/lib/kubelet/pods/0ea658e5-7318-49c3-ab43-00c885902e4f/volumes" Dec 12 00:55:15 crc kubenswrapper[4606]: I1212 00:55:15.718954 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18577071-2d04-4c75-99fd-cd721836a571" 
path="/var/lib/kubelet/pods/18577071-2d04-4c75-99fd-cd721836a571/volumes" Dec 12 00:55:15 crc kubenswrapper[4606]: I1212 00:55:15.720165 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00e1c7b-96a5-46de-8f2e-66ed8ff9275c" path="/var/lib/kubelet/pods/b00e1c7b-96a5-46de-8f2e-66ed8ff9275c/volumes" Dec 12 00:55:21 crc kubenswrapper[4606]: I1212 00:55:21.031679 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-j24jk"] Dec 12 00:55:21 crc kubenswrapper[4606]: I1212 00:55:21.042259 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-j24jk"] Dec 12 00:55:21 crc kubenswrapper[4606]: I1212 00:55:21.711709 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58ecb89-37d3-4d35-9a1a-9820df05848e" path="/var/lib/kubelet/pods/a58ecb89-37d3-4d35-9a1a-9820df05848e/volumes" Dec 12 00:55:32 crc kubenswrapper[4606]: I1212 00:55:32.010300 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:55:32 crc kubenswrapper[4606]: I1212 00:55:32.012236 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:55:57 crc kubenswrapper[4606]: I1212 00:55:57.070618 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hp9lg"] Dec 12 00:55:57 crc kubenswrapper[4606]: I1212 00:55:57.080969 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hp9lg"] Dec 12 00:55:57 crc kubenswrapper[4606]: I1212 
00:55:57.710769 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0c3f48-d61e-420d-ab53-61361c7a4a25" path="/var/lib/kubelet/pods/ce0c3f48-d61e-420d-ab53-61361c7a4a25/volumes" Dec 12 00:56:01 crc kubenswrapper[4606]: I1212 00:56:01.844686 4606 scope.go:117] "RemoveContainer" containerID="e76773484484c312e8bf03d6e9ac05b4491b9cd0a33d4352cfc58f910464c2e5" Dec 12 00:56:01 crc kubenswrapper[4606]: I1212 00:56:01.897947 4606 scope.go:117] "RemoveContainer" containerID="9313441d4ff4e009604a3cd040bfbe6d68e5a117c52c8de43661a35abe25c295" Dec 12 00:56:01 crc kubenswrapper[4606]: I1212 00:56:01.965728 4606 scope.go:117] "RemoveContainer" containerID="43e74f811bdd82896befc7495dab3869fa3c81bec160c4912838cddece137eca" Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.010579 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.010668 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.017944 4606 scope.go:117] "RemoveContainer" containerID="7590f4e5bf7c08d9e7683102e025e145ecbaa39ebc5127fba13a965ecebb7d1c" Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.050519 4606 scope.go:117] "RemoveContainer" containerID="8944007f23342877bee55299498a19ce188152f5110dc69dad0fc44d79539aa2" Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.101710 4606 scope.go:117] "RemoveContainer" 
containerID="2960132e59f66efa0c957f351e2825dee6705cc8549fd74234110e8fe867a2bf" Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.132277 4606 scope.go:117] "RemoveContainer" containerID="54016490d6a36738eecae0c922fe64bcf2044ea1661f592cea65e0ef980ce0ae" Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.163650 4606 scope.go:117] "RemoveContainer" containerID="f02cd2fb8847afdb298b2342547cde2b3f66ced980c44d729cd2003e20832d41" Dec 12 00:56:02 crc kubenswrapper[4606]: I1212 00:56:02.189727 4606 scope.go:117] "RemoveContainer" containerID="28dd3737c76d219cc9b1806a838aaa6e389ed9dc5614f000677c18465b7744ec" Dec 12 00:56:16 crc kubenswrapper[4606]: I1212 00:56:16.048424 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4hg5f"] Dec 12 00:56:16 crc kubenswrapper[4606]: I1212 00:56:16.059624 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4hg5f"] Dec 12 00:56:17 crc kubenswrapper[4606]: I1212 00:56:17.714335 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a0b6b98-c743-4435-a967-55c0edb95531" path="/var/lib/kubelet/pods/5a0b6b98-c743-4435-a967-55c0edb95531/volumes" Dec 12 00:56:19 crc kubenswrapper[4606]: I1212 00:56:19.064237 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-qmcpn"] Dec 12 00:56:19 crc kubenswrapper[4606]: I1212 00:56:19.075563 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-qmcpn"] Dec 12 00:56:19 crc kubenswrapper[4606]: I1212 00:56:19.088446 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kcwn5"] Dec 12 00:56:19 crc kubenswrapper[4606]: I1212 00:56:19.097437 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kcwn5"] Dec 12 00:56:19 crc kubenswrapper[4606]: I1212 00:56:19.711343 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c136c9-e12d-434a-aab1-ed21dfaf0f60" 
path="/var/lib/kubelet/pods/73c136c9-e12d-434a-aab1-ed21dfaf0f60/volumes" Dec 12 00:56:19 crc kubenswrapper[4606]: I1212 00:56:19.712773 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff95d54-4b78-48cb-b8c9-33801a6818f0" path="/var/lib/kubelet/pods/8ff95d54-4b78-48cb-b8c9-33801a6818f0/volumes" Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.010979 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.011572 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.011615 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.012278 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99029e9498baa7a74ac4278469a21ec659c962bde39e91cbead95d5b96de00b5"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.012337 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" 
containerName="machine-config-daemon" containerID="cri-o://99029e9498baa7a74ac4278469a21ec659c962bde39e91cbead95d5b96de00b5" gracePeriod=600 Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.697744 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="99029e9498baa7a74ac4278469a21ec659c962bde39e91cbead95d5b96de00b5" exitCode=0 Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.698056 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"99029e9498baa7a74ac4278469a21ec659c962bde39e91cbead95d5b96de00b5"} Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.698105 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553"} Dec 12 00:56:32 crc kubenswrapper[4606]: I1212 00:56:32.698123 4606 scope.go:117] "RemoveContainer" containerID="77f50006b910ef408eff1d278d14d4c6df84559c43dc90a75f9e7ad4aa5dad37" Dec 12 00:56:39 crc kubenswrapper[4606]: I1212 00:56:39.046320 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9xbj8"] Dec 12 00:56:39 crc kubenswrapper[4606]: I1212 00:56:39.065020 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9xbj8"] Dec 12 00:56:39 crc kubenswrapper[4606]: I1212 00:56:39.716450 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7978c0cd-b859-49f1-ad0e-1cb88ff58495" path="/var/lib/kubelet/pods/7978c0cd-b859-49f1-ad0e-1cb88ff58495/volumes" Dec 12 00:57:01 crc kubenswrapper[4606]: I1212 00:57:01.997928 4606 generic.go:334] "Generic (PLEG): container finished" podID="0d1c2004-cdcb-4729-a4f2-43a08adf9c04" 
containerID="3821f6ab823d4a08913b2037a3798473e2de5adfe64e889fb5c6b00d7d0a6653" exitCode=0 Dec 12 00:57:01 crc kubenswrapper[4606]: I1212 00:57:01.997993 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" event={"ID":"0d1c2004-cdcb-4729-a4f2-43a08adf9c04","Type":"ContainerDied","Data":"3821f6ab823d4a08913b2037a3798473e2de5adfe64e889fb5c6b00d7d0a6653"} Dec 12 00:57:02 crc kubenswrapper[4606]: I1212 00:57:02.390604 4606 scope.go:117] "RemoveContainer" containerID="b515f4d8e08e5e2e4e36edb12127b4cd245223498e1e82b164d9c51b5ca6bd93" Dec 12 00:57:02 crc kubenswrapper[4606]: I1212 00:57:02.433713 4606 scope.go:117] "RemoveContainer" containerID="6123990c2827dad2f196359e8f22bb8b9f9b9d940730041d059d21c3bc9fd5e1" Dec 12 00:57:02 crc kubenswrapper[4606]: I1212 00:57:02.480354 4606 scope.go:117] "RemoveContainer" containerID="f7c412f15338c35c0d22eec74a9158bf49f2120376a8b6e416c4b366c83906f5" Dec 12 00:57:02 crc kubenswrapper[4606]: I1212 00:57:02.521605 4606 scope.go:117] "RemoveContainer" containerID="0e8eece22d2df1260b252eb6bff1082a6dcbf84ce11e6b0f6c0ac4cd3cd3b87a" Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.466872 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.490931 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvjmx\" (UniqueName: \"kubernetes.io/projected/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-kube-api-access-mvjmx\") pod \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.491202 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-ssh-key\") pod \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.491231 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-inventory\") pod \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\" (UID: \"0d1c2004-cdcb-4729-a4f2-43a08adf9c04\") " Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.509129 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-kube-api-access-mvjmx" (OuterVolumeSpecName: "kube-api-access-mvjmx") pod "0d1c2004-cdcb-4729-a4f2-43a08adf9c04" (UID: "0d1c2004-cdcb-4729-a4f2-43a08adf9c04"). InnerVolumeSpecName "kube-api-access-mvjmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.521294 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d1c2004-cdcb-4729-a4f2-43a08adf9c04" (UID: "0d1c2004-cdcb-4729-a4f2-43a08adf9c04"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.541935 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-inventory" (OuterVolumeSpecName: "inventory") pod "0d1c2004-cdcb-4729-a4f2-43a08adf9c04" (UID: "0d1c2004-cdcb-4729-a4f2-43a08adf9c04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.593763 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.593819 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 00:57:03 crc kubenswrapper[4606]: I1212 00:57:03.593843 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvjmx\" (UniqueName: \"kubernetes.io/projected/0d1c2004-cdcb-4729-a4f2-43a08adf9c04-kube-api-access-mvjmx\") on node \"crc\" DevicePath \"\"" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.021429 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" event={"ID":"0d1c2004-cdcb-4729-a4f2-43a08adf9c04","Type":"ContainerDied","Data":"912c5fad06924179b1a1d225d50620afe4c9ccecd886fa02cbe9450e741a1f55"} Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.021472 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="912c5fad06924179b1a1d225d50620afe4c9ccecd886fa02cbe9450e741a1f55" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.021570 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.103231 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r"] Dec 12 00:57:04 crc kubenswrapper[4606]: E1212 00:57:04.103729 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d1c2004-cdcb-4729-a4f2-43a08adf9c04" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.103754 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1c2004-cdcb-4729-a4f2-43a08adf9c04" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.104060 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d1c2004-cdcb-4729-a4f2-43a08adf9c04" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.104873 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.109547 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.109929 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.110604 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.110963 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.113262 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r"] Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.207624 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn78d\" (UniqueName: \"kubernetes.io/projected/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-kube-api-access-tn78d\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.207745 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: 
I1212 00:57:04.207816 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.309401 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn78d\" (UniqueName: \"kubernetes.io/projected/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-kube-api-access-tn78d\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.309526 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.309614 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.313633 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.315710 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.331012 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn78d\" (UniqueName: \"kubernetes.io/projected/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-kube-api-access-tn78d\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kz97r\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:04 crc kubenswrapper[4606]: I1212 00:57:04.423923 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:57:05 crc kubenswrapper[4606]: I1212 00:57:05.002393 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r"] Dec 12 00:57:05 crc kubenswrapper[4606]: I1212 00:57:05.039977 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" event={"ID":"ffbef1cf-b8bf-44ae-be40-52fa989c44d7","Type":"ContainerStarted","Data":"fe7889b8ae93d94c5a7d47be18e18084c7bbdc0e3c9668fb51e692bdfa45634b"} Dec 12 00:57:06 crc kubenswrapper[4606]: I1212 00:57:06.054653 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" event={"ID":"ffbef1cf-b8bf-44ae-be40-52fa989c44d7","Type":"ContainerStarted","Data":"cf22bba676dfe0747765417a68a316d4e5bd0483069b253bc8d4179b4d18baf6"} Dec 12 00:57:06 crc kubenswrapper[4606]: I1212 00:57:06.072479 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" podStartSLOduration=1.886381989 podStartE2EDuration="2.072449746s" podCreationTimestamp="2025-12-12 00:57:04 +0000 UTC" firstStartedPulling="2025-12-12 00:57:05.009586821 +0000 UTC m=+2015.554939687" lastFinishedPulling="2025-12-12 00:57:05.195654568 +0000 UTC m=+2015.741007444" observedRunningTime="2025-12-12 00:57:06.072046185 +0000 UTC m=+2016.617399051" watchObservedRunningTime="2025-12-12 00:57:06.072449746 +0000 UTC m=+2016.617802612" Dec 12 00:57:24 crc kubenswrapper[4606]: I1212 00:57:24.061407 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5k8jp"] Dec 12 00:57:24 crc kubenswrapper[4606]: I1212 00:57:24.075295 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7fad-account-create-update-bgwst"] Dec 12 00:57:24 
crc kubenswrapper[4606]: I1212 00:57:24.091679 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5k8jp"] Dec 12 00:57:24 crc kubenswrapper[4606]: I1212 00:57:24.101531 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lh8vs"] Dec 12 00:57:24 crc kubenswrapper[4606]: I1212 00:57:24.108390 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7fad-account-create-update-bgwst"] Dec 12 00:57:24 crc kubenswrapper[4606]: I1212 00:57:24.115319 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lh8vs"] Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.030551 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zhvg8"] Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.048256 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-13d9-account-create-update-glqls"] Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.060287 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-13d9-account-create-update-glqls"] Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.071115 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zhvg8"] Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.079542 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6bb0-account-create-update-p7kwg"] Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.085912 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6bb0-account-create-update-p7kwg"] Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.713759 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbd1699-c391-4c27-9a8b-9dadfc9d5530" path="/var/lib/kubelet/pods/2bbd1699-c391-4c27-9a8b-9dadfc9d5530/volumes" Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.715106 
4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3483c50d-cf68-45ab-b01b-7fe2e6f1c057" path="/var/lib/kubelet/pods/3483c50d-cf68-45ab-b01b-7fe2e6f1c057/volumes" Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.716486 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509a7acf-27c5-45b9-8018-2b21b84b9b0a" path="/var/lib/kubelet/pods/509a7acf-27c5-45b9-8018-2b21b84b9b0a/volumes" Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.717893 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920936eb-f659-4feb-b571-e90906e8bee2" path="/var/lib/kubelet/pods/920936eb-f659-4feb-b571-e90906e8bee2/volumes" Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.719815 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3208805-45c9-44bd-b7b3-622cdbc2dae9" path="/var/lib/kubelet/pods/c3208805-45c9-44bd-b7b3-622cdbc2dae9/volumes" Dec 12 00:57:25 crc kubenswrapper[4606]: I1212 00:57:25.721163 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda" path="/var/lib/kubelet/pods/f17c1ec8-277b-4c4d-9dc7-75cdc2fb0eda/volumes" Dec 12 00:58:00 crc kubenswrapper[4606]: I1212 00:58:00.071153 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s2ncv"] Dec 12 00:58:00 crc kubenswrapper[4606]: I1212 00:58:00.085772 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s2ncv"] Dec 12 00:58:01 crc kubenswrapper[4606]: I1212 00:58:01.711338 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f59adf-51e1-45e9-95d8-ac24a6310f1e" path="/var/lib/kubelet/pods/73f59adf-51e1-45e9-95d8-ac24a6310f1e/volumes" Dec 12 00:58:02 crc kubenswrapper[4606]: I1212 00:58:02.661549 4606 scope.go:117] "RemoveContainer" containerID="38af5d998135ba12b6d1e8e5c2970e40b0c33bb52defd76f200d68829570d9ad" Dec 12 00:58:02 crc kubenswrapper[4606]: 
I1212 00:58:02.683964 4606 scope.go:117] "RemoveContainer" containerID="fdd80f0fe03a58d54350f78d7440602e42cfd91a7035fe119613a3b5bac54740" Dec 12 00:58:02 crc kubenswrapper[4606]: I1212 00:58:02.754794 4606 scope.go:117] "RemoveContainer" containerID="213d60894d2453712748ea73c425102921cc0b24b95201f323d5c4ba48526224" Dec 12 00:58:02 crc kubenswrapper[4606]: I1212 00:58:02.798995 4606 scope.go:117] "RemoveContainer" containerID="a8677ce0e335635f3ad4f08931e968cfaa89743560ba6a0c94f0e8aa068b2550" Dec 12 00:58:02 crc kubenswrapper[4606]: I1212 00:58:02.827716 4606 scope.go:117] "RemoveContainer" containerID="4cf8b7c0ae6c560fa6186ad7009387bc04e6f2e401a474840065e803126a8b84" Dec 12 00:58:02 crc kubenswrapper[4606]: I1212 00:58:02.875863 4606 scope.go:117] "RemoveContainer" containerID="06e78b4586f69ffcd4a6697e1b14dae2d3096a03ec94847beb9405c7f122232a" Dec 12 00:58:02 crc kubenswrapper[4606]: I1212 00:58:02.911402 4606 scope.go:117] "RemoveContainer" containerID="0037d02b8812391ec58697f3534f566bd94955c3d84ef3b0cf1e6e57cbb7e6f9" Dec 12 00:58:25 crc kubenswrapper[4606]: I1212 00:58:25.032648 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tfrdx"] Dec 12 00:58:25 crc kubenswrapper[4606]: I1212 00:58:25.041337 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tfrdx"] Dec 12 00:58:25 crc kubenswrapper[4606]: I1212 00:58:25.710920 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a61d852-d814-4230-9ab7-4d0b5742b00a" path="/var/lib/kubelet/pods/8a61d852-d814-4230-9ab7-4d0b5742b00a/volumes" Dec 12 00:58:26 crc kubenswrapper[4606]: I1212 00:58:26.028820 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xzftq"] Dec 12 00:58:26 crc kubenswrapper[4606]: I1212 00:58:26.041704 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xzftq"] Dec 12 00:58:27 crc kubenswrapper[4606]: I1212 
00:58:27.715416 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4" path="/var/lib/kubelet/pods/cdd1a39f-c0cc-4f2f-938b-b170f5f7d9a4/volumes" Dec 12 00:58:29 crc kubenswrapper[4606]: I1212 00:58:29.877931 4606 generic.go:334] "Generic (PLEG): container finished" podID="ffbef1cf-b8bf-44ae-be40-52fa989c44d7" containerID="cf22bba676dfe0747765417a68a316d4e5bd0483069b253bc8d4179b4d18baf6" exitCode=0 Dec 12 00:58:29 crc kubenswrapper[4606]: I1212 00:58:29.877982 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" event={"ID":"ffbef1cf-b8bf-44ae-be40-52fa989c44d7","Type":"ContainerDied","Data":"cf22bba676dfe0747765417a68a316d4e5bd0483069b253bc8d4179b4d18baf6"} Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.337043 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.436742 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn78d\" (UniqueName: \"kubernetes.io/projected/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-kube-api-access-tn78d\") pod \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.437013 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-ssh-key\") pod \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.437207 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-inventory\") pod 
\"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\" (UID: \"ffbef1cf-b8bf-44ae-be40-52fa989c44d7\") " Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.457979 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-kube-api-access-tn78d" (OuterVolumeSpecName: "kube-api-access-tn78d") pod "ffbef1cf-b8bf-44ae-be40-52fa989c44d7" (UID: "ffbef1cf-b8bf-44ae-be40-52fa989c44d7"). InnerVolumeSpecName "kube-api-access-tn78d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.465425 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-inventory" (OuterVolumeSpecName: "inventory") pod "ffbef1cf-b8bf-44ae-be40-52fa989c44d7" (UID: "ffbef1cf-b8bf-44ae-be40-52fa989c44d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.472884 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffbef1cf-b8bf-44ae-be40-52fa989c44d7" (UID: "ffbef1cf-b8bf-44ae-be40-52fa989c44d7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.540317 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.540365 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.540380 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn78d\" (UniqueName: \"kubernetes.io/projected/ffbef1cf-b8bf-44ae-be40-52fa989c44d7-kube-api-access-tn78d\") on node \"crc\" DevicePath \"\"" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.902312 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" event={"ID":"ffbef1cf-b8bf-44ae-be40-52fa989c44d7","Type":"ContainerDied","Data":"fe7889b8ae93d94c5a7d47be18e18084c7bbdc0e3c9668fb51e692bdfa45634b"} Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.902386 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe7889b8ae93d94c5a7d47be18e18084c7bbdc0e3c9668fb51e692bdfa45634b" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.902404 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kz97r" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.983494 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x"] Dec 12 00:58:31 crc kubenswrapper[4606]: E1212 00:58:31.984118 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbef1cf-b8bf-44ae-be40-52fa989c44d7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.984143 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbef1cf-b8bf-44ae-be40-52fa989c44d7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.984465 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbef1cf-b8bf-44ae-be40-52fa989c44d7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.985247 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.991453 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.992082 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.992680 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.994903 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:58:31 crc kubenswrapper[4606]: I1212 00:58:31.998486 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x"] Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.010321 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.010419 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.051903 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.051966 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.052060 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxbwf\" (UniqueName: \"kubernetes.io/projected/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-kube-api-access-gxbwf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.153799 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.153866 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.153963 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxbwf\" (UniqueName: \"kubernetes.io/projected/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-kube-api-access-gxbwf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.159762 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.160088 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.171475 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxbwf\" (UniqueName: \"kubernetes.io/projected/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-kube-api-access-gxbwf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gph9x\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.306539 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:32 crc kubenswrapper[4606]: W1212 00:58:32.864267 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a1bfeda_d153_4465_b060_5f8b3b5d5b23.slice/crio-92bb8e842abda004fcfdd52da6e58bdcb3ff8ced41b05362ce8a5c7a7276c609 WatchSource:0}: Error finding container 92bb8e842abda004fcfdd52da6e58bdcb3ff8ced41b05362ce8a5c7a7276c609: Status 404 returned error can't find the container with id 92bb8e842abda004fcfdd52da6e58bdcb3ff8ced41b05362ce8a5c7a7276c609 Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.875885 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x"] Dec 12 00:58:32 crc kubenswrapper[4606]: I1212 00:58:32.916341 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" event={"ID":"5a1bfeda-d153-4465-b060-5f8b3b5d5b23","Type":"ContainerStarted","Data":"92bb8e842abda004fcfdd52da6e58bdcb3ff8ced41b05362ce8a5c7a7276c609"} Dec 12 00:58:33 crc kubenswrapper[4606]: I1212 00:58:33.928580 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" event={"ID":"5a1bfeda-d153-4465-b060-5f8b3b5d5b23","Type":"ContainerStarted","Data":"af1d214078dd75c924161200297e89830d7048dde868c09f91e40f899ebcd7d0"} Dec 12 00:58:33 crc kubenswrapper[4606]: I1212 00:58:33.956877 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" podStartSLOduration=2.784882164 podStartE2EDuration="2.956834845s" podCreationTimestamp="2025-12-12 00:58:31 +0000 UTC" firstStartedPulling="2025-12-12 00:58:32.867413355 +0000 UTC m=+2103.412766251" lastFinishedPulling="2025-12-12 00:58:33.039366026 +0000 UTC 
m=+2103.584718932" observedRunningTime="2025-12-12 00:58:33.951525574 +0000 UTC m=+2104.496878460" watchObservedRunningTime="2025-12-12 00:58:33.956834845 +0000 UTC m=+2104.502187721" Dec 12 00:58:38 crc kubenswrapper[4606]: I1212 00:58:38.980360 4606 generic.go:334] "Generic (PLEG): container finished" podID="5a1bfeda-d153-4465-b060-5f8b3b5d5b23" containerID="af1d214078dd75c924161200297e89830d7048dde868c09f91e40f899ebcd7d0" exitCode=0 Dec 12 00:58:38 crc kubenswrapper[4606]: I1212 00:58:38.980580 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" event={"ID":"5a1bfeda-d153-4465-b060-5f8b3b5d5b23","Type":"ContainerDied","Data":"af1d214078dd75c924161200297e89830d7048dde868c09f91e40f899ebcd7d0"} Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.418189 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.513912 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-inventory\") pod \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.514361 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-ssh-key\") pod \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\" (UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.514512 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxbwf\" (UniqueName: \"kubernetes.io/projected/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-kube-api-access-gxbwf\") pod \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\" 
(UID: \"5a1bfeda-d153-4465-b060-5f8b3b5d5b23\") " Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.518860 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-kube-api-access-gxbwf" (OuterVolumeSpecName: "kube-api-access-gxbwf") pod "5a1bfeda-d153-4465-b060-5f8b3b5d5b23" (UID: "5a1bfeda-d153-4465-b060-5f8b3b5d5b23"). InnerVolumeSpecName "kube-api-access-gxbwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.541394 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-inventory" (OuterVolumeSpecName: "inventory") pod "5a1bfeda-d153-4465-b060-5f8b3b5d5b23" (UID: "5a1bfeda-d153-4465-b060-5f8b3b5d5b23"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.543123 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5a1bfeda-d153-4465-b060-5f8b3b5d5b23" (UID: "5a1bfeda-d153-4465-b060-5f8b3b5d5b23"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.617074 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.617112 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxbwf\" (UniqueName: \"kubernetes.io/projected/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-kube-api-access-gxbwf\") on node \"crc\" DevicePath \"\"" Dec 12 00:58:40 crc kubenswrapper[4606]: I1212 00:58:40.617123 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a1bfeda-d153-4465-b060-5f8b3b5d5b23-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.001735 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" event={"ID":"5a1bfeda-d153-4465-b060-5f8b3b5d5b23","Type":"ContainerDied","Data":"92bb8e842abda004fcfdd52da6e58bdcb3ff8ced41b05362ce8a5c7a7276c609"} Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.001774 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92bb8e842abda004fcfdd52da6e58bdcb3ff8ced41b05362ce8a5c7a7276c609" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.001784 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gph9x" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.119273 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g"] Dec 12 00:58:41 crc kubenswrapper[4606]: E1212 00:58:41.130091 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1bfeda-d153-4465-b060-5f8b3b5d5b23" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.130130 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1bfeda-d153-4465-b060-5f8b3b5d5b23" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.130490 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1bfeda-d153-4465-b060-5f8b3b5d5b23" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.131242 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g"] Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.131363 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.134855 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.135106 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.135343 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.137343 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.228136 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.228318 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.228435 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlt56\" (UniqueName: \"kubernetes.io/projected/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-kube-api-access-nlt56\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.329892 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.329960 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.330036 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlt56\" (UniqueName: \"kubernetes.io/projected/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-kube-api-access-nlt56\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.334116 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.334920 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.347797 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlt56\" (UniqueName: \"kubernetes.io/projected/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-kube-api-access-nlt56\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h8w5g\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.461240 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:58:41 crc kubenswrapper[4606]: I1212 00:58:41.978337 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g"] Dec 12 00:58:42 crc kubenswrapper[4606]: I1212 00:58:42.011974 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" event={"ID":"9ffe19a0-667c-400f-b80f-3ddedcbec6dd","Type":"ContainerStarted","Data":"07863bf19fac130b450123035ae0924a3b24cb5430b84170dc70b64cc09ee2f3"} Dec 12 00:58:43 crc kubenswrapper[4606]: I1212 00:58:43.022387 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" event={"ID":"9ffe19a0-667c-400f-b80f-3ddedcbec6dd","Type":"ContainerStarted","Data":"c9d8199c41ddb3afaeb1a6c64dce61b7a0a8e52f022fa120cbbc83534d661ce8"} Dec 12 00:58:43 crc kubenswrapper[4606]: I1212 00:58:43.052741 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" podStartSLOduration=1.878837464 podStartE2EDuration="2.052723286s" podCreationTimestamp="2025-12-12 00:58:41 +0000 UTC" firstStartedPulling="2025-12-12 00:58:41.987076337 +0000 UTC m=+2112.532429203" lastFinishedPulling="2025-12-12 00:58:42.160962159 +0000 UTC m=+2112.706315025" observedRunningTime="2025-12-12 00:58:43.041048405 +0000 UTC m=+2113.586401271" watchObservedRunningTime="2025-12-12 00:58:43.052723286 +0000 UTC m=+2113.598076152" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.251673 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zfp5f"] Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.254747 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.266572 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfp5f"] Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.444444 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-utilities\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.444514 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jmt\" (UniqueName: \"kubernetes.io/projected/5965cae1-310d-4026-bbee-f637e9677cd7-kube-api-access-c8jmt\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.444615 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-catalog-content\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.548863 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-utilities\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.549257 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8jmt\" (UniqueName: \"kubernetes.io/projected/5965cae1-310d-4026-bbee-f637e9677cd7-kube-api-access-c8jmt\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.549323 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-catalog-content\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.549726 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-catalog-content\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.550019 4606 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-utilities\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.574963 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jmt\" (UniqueName: \"kubernetes.io/projected/5965cae1-310d-4026-bbee-f637e9677cd7-kube-api-access-c8jmt\") pod \"redhat-operators-zfp5f\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.586692 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:58:56 crc kubenswrapper[4606]: I1212 00:58:56.921093 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfp5f"] Dec 12 00:58:57 crc kubenswrapper[4606]: I1212 00:58:57.154948 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerStarted","Data":"a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e"} Dec 12 00:58:57 crc kubenswrapper[4606]: I1212 00:58:57.155394 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerStarted","Data":"bb82e1d22d1c31874c10914ff9fb03732a97aa1d99403b6fc3855720af668568"} Dec 12 00:58:58 crc kubenswrapper[4606]: I1212 00:58:58.172414 4606 generic.go:334] "Generic (PLEG): container finished" podID="5965cae1-310d-4026-bbee-f637e9677cd7" containerID="a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e" exitCode=0 Dec 12 00:58:58 crc kubenswrapper[4606]: I1212 00:58:58.172468 4606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerDied","Data":"a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e"} Dec 12 00:59:00 crc kubenswrapper[4606]: I1212 00:59:00.196277 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerStarted","Data":"83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877"} Dec 12 00:59:02 crc kubenswrapper[4606]: I1212 00:59:02.010916 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:59:02 crc kubenswrapper[4606]: I1212 00:59:02.011981 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:59:02 crc kubenswrapper[4606]: I1212 00:59:02.216939 4606 generic.go:334] "Generic (PLEG): container finished" podID="5965cae1-310d-4026-bbee-f637e9677cd7" containerID="83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877" exitCode=0 Dec 12 00:59:02 crc kubenswrapper[4606]: I1212 00:59:02.216985 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerDied","Data":"83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877"} Dec 12 00:59:03 crc kubenswrapper[4606]: I1212 00:59:03.049425 4606 scope.go:117] "RemoveContainer" 
containerID="e4bfde433c9e93055eba44d7b0159e70ca005ce3945986a0a0d98516213ccef1" Dec 12 00:59:03 crc kubenswrapper[4606]: I1212 00:59:03.095048 4606 scope.go:117] "RemoveContainer" containerID="ba05fb9ddf63c631f27d231a8d8954e06df430388b725101e1130c24e60f9fcc" Dec 12 00:59:03 crc kubenswrapper[4606]: I1212 00:59:03.228439 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerStarted","Data":"1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb"} Dec 12 00:59:03 crc kubenswrapper[4606]: I1212 00:59:03.246474 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zfp5f" podStartSLOduration=2.742589907 podStartE2EDuration="7.246452574s" podCreationTimestamp="2025-12-12 00:58:56 +0000 UTC" firstStartedPulling="2025-12-12 00:58:58.175366628 +0000 UTC m=+2128.720719494" lastFinishedPulling="2025-12-12 00:59:02.679229285 +0000 UTC m=+2133.224582161" observedRunningTime="2025-12-12 00:59:03.243715961 +0000 UTC m=+2133.789068827" watchObservedRunningTime="2025-12-12 00:59:03.246452574 +0000 UTC m=+2133.791805440" Dec 12 00:59:06 crc kubenswrapper[4606]: I1212 00:59:06.587355 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:59:06 crc kubenswrapper[4606]: I1212 00:59:06.587648 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:59:06 crc kubenswrapper[4606]: I1212 00:59:06.962242 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hcsh2"] Dec 12 00:59:06 crc kubenswrapper[4606]: I1212 00:59:06.964673 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:06 crc kubenswrapper[4606]: I1212 00:59:06.977312 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcsh2"] Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.062114 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmjc8\" (UniqueName: \"kubernetes.io/projected/17bc90fc-6056-411c-943f-927e1329fe16-kube-api-access-wmjc8\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.062284 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-catalog-content\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.062320 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-utilities\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.163373 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmjc8\" (UniqueName: \"kubernetes.io/projected/17bc90fc-6056-411c-943f-927e1329fe16-kube-api-access-wmjc8\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.163451 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-catalog-content\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.163483 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-utilities\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.163977 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-utilities\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.164057 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-catalog-content\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.186244 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmjc8\" (UniqueName: \"kubernetes.io/projected/17bc90fc-6056-411c-943f-927e1329fe16-kube-api-access-wmjc8\") pod \"redhat-marketplace-hcsh2\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.293581 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.637110 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zfp5f" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="registry-server" probeResult="failure" output=< Dec 12 00:59:07 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 00:59:07 crc kubenswrapper[4606]: > Dec 12 00:59:07 crc kubenswrapper[4606]: I1212 00:59:07.806149 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcsh2"] Dec 12 00:59:07 crc kubenswrapper[4606]: W1212 00:59:07.807566 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17bc90fc_6056_411c_943f_927e1329fe16.slice/crio-7ac16f114dbd113c4a085e0b3d1d9e617d24107c27c2f71d3005913fcf50bf53 WatchSource:0}: Error finding container 7ac16f114dbd113c4a085e0b3d1d9e617d24107c27c2f71d3005913fcf50bf53: Status 404 returned error can't find the container with id 7ac16f114dbd113c4a085e0b3d1d9e617d24107c27c2f71d3005913fcf50bf53 Dec 12 00:59:08 crc kubenswrapper[4606]: I1212 00:59:08.274878 4606 generic.go:334] "Generic (PLEG): container finished" podID="17bc90fc-6056-411c-943f-927e1329fe16" containerID="96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84" exitCode=0 Dec 12 00:59:08 crc kubenswrapper[4606]: I1212 00:59:08.275148 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcsh2" event={"ID":"17bc90fc-6056-411c-943f-927e1329fe16","Type":"ContainerDied","Data":"96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84"} Dec 12 00:59:08 crc kubenswrapper[4606]: I1212 00:59:08.275325 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcsh2" 
event={"ID":"17bc90fc-6056-411c-943f-927e1329fe16","Type":"ContainerStarted","Data":"7ac16f114dbd113c4a085e0b3d1d9e617d24107c27c2f71d3005913fcf50bf53"} Dec 12 00:59:09 crc kubenswrapper[4606]: I1212 00:59:09.285048 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcsh2" event={"ID":"17bc90fc-6056-411c-943f-927e1329fe16","Type":"ContainerStarted","Data":"914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601"} Dec 12 00:59:10 crc kubenswrapper[4606]: I1212 00:59:10.296792 4606 generic.go:334] "Generic (PLEG): container finished" podID="17bc90fc-6056-411c-943f-927e1329fe16" containerID="914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601" exitCode=0 Dec 12 00:59:10 crc kubenswrapper[4606]: I1212 00:59:10.296852 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcsh2" event={"ID":"17bc90fc-6056-411c-943f-927e1329fe16","Type":"ContainerDied","Data":"914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601"} Dec 12 00:59:11 crc kubenswrapper[4606]: I1212 00:59:11.050674 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pz8fh"] Dec 12 00:59:11 crc kubenswrapper[4606]: I1212 00:59:11.060287 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pz8fh"] Dec 12 00:59:11 crc kubenswrapper[4606]: I1212 00:59:11.309130 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcsh2" event={"ID":"17bc90fc-6056-411c-943f-927e1329fe16","Type":"ContainerStarted","Data":"ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f"} Dec 12 00:59:11 crc kubenswrapper[4606]: I1212 00:59:11.339159 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hcsh2" podStartSLOduration=2.835316236 podStartE2EDuration="5.339137736s" podCreationTimestamp="2025-12-12 
00:59:06 +0000 UTC" firstStartedPulling="2025-12-12 00:59:08.276928952 +0000 UTC m=+2138.822281808" lastFinishedPulling="2025-12-12 00:59:10.780750442 +0000 UTC m=+2141.326103308" observedRunningTime="2025-12-12 00:59:11.330742682 +0000 UTC m=+2141.876095558" watchObservedRunningTime="2025-12-12 00:59:11.339137736 +0000 UTC m=+2141.884490602" Dec 12 00:59:11 crc kubenswrapper[4606]: I1212 00:59:11.709121 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491bea2c-f0d9-45f2-bcf2-a49b4312e1f0" path="/var/lib/kubelet/pods/491bea2c-f0d9-45f2-bcf2-a49b4312e1f0/volumes" Dec 12 00:59:16 crc kubenswrapper[4606]: I1212 00:59:16.634730 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:59:16 crc kubenswrapper[4606]: I1212 00:59:16.696460 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:59:16 crc kubenswrapper[4606]: I1212 00:59:16.867726 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfp5f"] Dec 12 00:59:17 crc kubenswrapper[4606]: I1212 00:59:17.295234 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:17 crc kubenswrapper[4606]: I1212 00:59:17.295290 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:17 crc kubenswrapper[4606]: I1212 00:59:17.361081 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:17 crc kubenswrapper[4606]: I1212 00:59:17.418094 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:18 crc kubenswrapper[4606]: I1212 00:59:18.359415 4606 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zfp5f" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="registry-server" containerID="cri-o://1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb" gracePeriod=2 Dec 12 00:59:18 crc kubenswrapper[4606]: I1212 00:59:18.792114 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:59:18 crc kubenswrapper[4606]: I1212 00:59:18.983950 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8jmt\" (UniqueName: \"kubernetes.io/projected/5965cae1-310d-4026-bbee-f637e9677cd7-kube-api-access-c8jmt\") pod \"5965cae1-310d-4026-bbee-f637e9677cd7\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " Dec 12 00:59:18 crc kubenswrapper[4606]: I1212 00:59:18.984207 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-catalog-content\") pod \"5965cae1-310d-4026-bbee-f637e9677cd7\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " Dec 12 00:59:18 crc kubenswrapper[4606]: I1212 00:59:18.984279 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-utilities\") pod \"5965cae1-310d-4026-bbee-f637e9677cd7\" (UID: \"5965cae1-310d-4026-bbee-f637e9677cd7\") " Dec 12 00:59:18 crc kubenswrapper[4606]: I1212 00:59:18.985383 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-utilities" (OuterVolumeSpecName: "utilities") pod "5965cae1-310d-4026-bbee-f637e9677cd7" (UID: "5965cae1-310d-4026-bbee-f637e9677cd7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:59:18 crc kubenswrapper[4606]: I1212 00:59:18.996102 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5965cae1-310d-4026-bbee-f637e9677cd7-kube-api-access-c8jmt" (OuterVolumeSpecName: "kube-api-access-c8jmt") pod "5965cae1-310d-4026-bbee-f637e9677cd7" (UID: "5965cae1-310d-4026-bbee-f637e9677cd7"). InnerVolumeSpecName "kube-api-access-c8jmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.086396 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.086580 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8jmt\" (UniqueName: \"kubernetes.io/projected/5965cae1-310d-4026-bbee-f637e9677cd7-kube-api-access-c8jmt\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.104703 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5965cae1-310d-4026-bbee-f637e9677cd7" (UID: "5965cae1-310d-4026-bbee-f637e9677cd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.187939 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5965cae1-310d-4026-bbee-f637e9677cd7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.368907 4606 generic.go:334] "Generic (PLEG): container finished" podID="5965cae1-310d-4026-bbee-f637e9677cd7" containerID="1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb" exitCode=0 Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.368953 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerDied","Data":"1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb"} Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.368964 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfp5f" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.368987 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfp5f" event={"ID":"5965cae1-310d-4026-bbee-f637e9677cd7","Type":"ContainerDied","Data":"bb82e1d22d1c31874c10914ff9fb03732a97aa1d99403b6fc3855720af668568"} Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.369005 4606 scope.go:117] "RemoveContainer" containerID="1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.389464 4606 scope.go:117] "RemoveContainer" containerID="83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.414702 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfp5f"] Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.425013 4606 scope.go:117] "RemoveContainer" containerID="a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.428047 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zfp5f"] Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.482983 4606 scope.go:117] "RemoveContainer" containerID="1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb" Dec 12 00:59:19 crc kubenswrapper[4606]: E1212 00:59:19.483547 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb\": container with ID starting with 1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb not found: ID does not exist" containerID="1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.483577 4606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb"} err="failed to get container status \"1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb\": rpc error: code = NotFound desc = could not find container \"1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb\": container with ID starting with 1061c4536fafcf4804299388e16ea663607b26e9e475b57c3656ee65870600bb not found: ID does not exist" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.483598 4606 scope.go:117] "RemoveContainer" containerID="83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877" Dec 12 00:59:19 crc kubenswrapper[4606]: E1212 00:59:19.483917 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877\": container with ID starting with 83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877 not found: ID does not exist" containerID="83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.483942 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877"} err="failed to get container status \"83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877\": rpc error: code = NotFound desc = could not find container \"83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877\": container with ID starting with 83ff539727bcab1249d73ebc37cefa138c109be35649aacb3e7b20fe7c06e877 not found: ID does not exist" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.483954 4606 scope.go:117] "RemoveContainer" containerID="a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e" Dec 12 00:59:19 crc kubenswrapper[4606]: E1212 
00:59:19.484204 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e\": container with ID starting with a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e not found: ID does not exist" containerID="a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.484228 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e"} err="failed to get container status \"a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e\": rpc error: code = NotFound desc = could not find container \"a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e\": container with ID starting with a00fda902c0b7caea6ef85eff424a417be1abe3f656da143708c510b1e9ef45e not found: ID does not exist" Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.675777 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcsh2"] Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.676114 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hcsh2" podUID="17bc90fc-6056-411c-943f-927e1329fe16" containerName="registry-server" containerID="cri-o://ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f" gracePeriod=2 Dec 12 00:59:19 crc kubenswrapper[4606]: I1212 00:59:19.709875 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" path="/var/lib/kubelet/pods/5965cae1-310d-4026-bbee-f637e9677cd7/volumes" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.128145 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.306614 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-catalog-content\") pod \"17bc90fc-6056-411c-943f-927e1329fe16\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.306713 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmjc8\" (UniqueName: \"kubernetes.io/projected/17bc90fc-6056-411c-943f-927e1329fe16-kube-api-access-wmjc8\") pod \"17bc90fc-6056-411c-943f-927e1329fe16\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.306870 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-utilities\") pod \"17bc90fc-6056-411c-943f-927e1329fe16\" (UID: \"17bc90fc-6056-411c-943f-927e1329fe16\") " Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.307479 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-utilities" (OuterVolumeSpecName: "utilities") pod "17bc90fc-6056-411c-943f-927e1329fe16" (UID: "17bc90fc-6056-411c-943f-927e1329fe16"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.307668 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.312117 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17bc90fc-6056-411c-943f-927e1329fe16-kube-api-access-wmjc8" (OuterVolumeSpecName: "kube-api-access-wmjc8") pod "17bc90fc-6056-411c-943f-927e1329fe16" (UID: "17bc90fc-6056-411c-943f-927e1329fe16"). InnerVolumeSpecName "kube-api-access-wmjc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.339753 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17bc90fc-6056-411c-943f-927e1329fe16" (UID: "17bc90fc-6056-411c-943f-927e1329fe16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.382629 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcsh2" event={"ID":"17bc90fc-6056-411c-943f-927e1329fe16","Type":"ContainerDied","Data":"ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f"} Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.382688 4606 scope.go:117] "RemoveContainer" containerID="ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.384068 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcsh2" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.382459 4606 generic.go:334] "Generic (PLEG): container finished" podID="17bc90fc-6056-411c-943f-927e1329fe16" containerID="ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f" exitCode=0 Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.390886 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcsh2" event={"ID":"17bc90fc-6056-411c-943f-927e1329fe16","Type":"ContainerDied","Data":"7ac16f114dbd113c4a085e0b3d1d9e617d24107c27c2f71d3005913fcf50bf53"} Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.403117 4606 scope.go:117] "RemoveContainer" containerID="914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.409573 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmjc8\" (UniqueName: \"kubernetes.io/projected/17bc90fc-6056-411c-943f-927e1329fe16-kube-api-access-wmjc8\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.409612 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17bc90fc-6056-411c-943f-927e1329fe16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.440578 4606 scope.go:117] "RemoveContainer" containerID="96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.446253 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcsh2"] Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.461452 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcsh2"] Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.482886 4606 scope.go:117] 
"RemoveContainer" containerID="ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f" Dec 12 00:59:20 crc kubenswrapper[4606]: E1212 00:59:20.483502 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f\": container with ID starting with ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f not found: ID does not exist" containerID="ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.483545 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f"} err="failed to get container status \"ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f\": rpc error: code = NotFound desc = could not find container \"ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f\": container with ID starting with ab77cdeb5d12b93a712ad98ae0190eb350702d4bae389dd6ea0c47df23902d2f not found: ID does not exist" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.483573 4606 scope.go:117] "RemoveContainer" containerID="914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601" Dec 12 00:59:20 crc kubenswrapper[4606]: E1212 00:59:20.483988 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601\": container with ID starting with 914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601 not found: ID does not exist" containerID="914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.484030 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601"} err="failed to get container status \"914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601\": rpc error: code = NotFound desc = could not find container \"914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601\": container with ID starting with 914f5473de759a23212fc85bae1bda0d82337f37e7f60c2b48fbf23c0a0c1601 not found: ID does not exist" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.484260 4606 scope.go:117] "RemoveContainer" containerID="96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84" Dec 12 00:59:20 crc kubenswrapper[4606]: E1212 00:59:20.484535 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84\": container with ID starting with 96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84 not found: ID does not exist" containerID="96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84" Dec 12 00:59:20 crc kubenswrapper[4606]: I1212 00:59:20.484564 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84"} err="failed to get container status \"96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84\": rpc error: code = NotFound desc = could not find container \"96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84\": container with ID starting with 96508a388e2e59770dc0aad4f1f9584166af2bccced607b09cb0ed04fdd6fe84 not found: ID does not exist" Dec 12 00:59:21 crc kubenswrapper[4606]: I1212 00:59:21.711065 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17bc90fc-6056-411c-943f-927e1329fe16" path="/var/lib/kubelet/pods/17bc90fc-6056-411c-943f-927e1329fe16/volumes" Dec 12 00:59:31 crc kubenswrapper[4606]: I1212 
00:59:31.477703 4606 generic.go:334] "Generic (PLEG): container finished" podID="9ffe19a0-667c-400f-b80f-3ddedcbec6dd" containerID="c9d8199c41ddb3afaeb1a6c64dce61b7a0a8e52f022fa120cbbc83534d661ce8" exitCode=0 Dec 12 00:59:31 crc kubenswrapper[4606]: I1212 00:59:31.477775 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" event={"ID":"9ffe19a0-667c-400f-b80f-3ddedcbec6dd","Type":"ContainerDied","Data":"c9d8199c41ddb3afaeb1a6c64dce61b7a0a8e52f022fa120cbbc83534d661ce8"} Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.010580 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.010856 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.010898 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.011653 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.011722 
4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" gracePeriod=600 Dec 12 00:59:32 crc kubenswrapper[4606]: E1212 00:59:32.133846 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.491163 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" exitCode=0 Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.491204 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553"} Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.491271 4606 scope.go:117] "RemoveContainer" containerID="99029e9498baa7a74ac4278469a21ec659c962bde39e91cbead95d5b96de00b5" Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.491995 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 00:59:32 crc kubenswrapper[4606]: E1212 00:59:32.492296 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:59:32 crc kubenswrapper[4606]: I1212 00:59:32.941014 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.049777 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-ssh-key\") pod \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.049846 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlt56\" (UniqueName: \"kubernetes.io/projected/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-kube-api-access-nlt56\") pod \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.049972 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-inventory\") pod \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\" (UID: \"9ffe19a0-667c-400f-b80f-3ddedcbec6dd\") " Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.056237 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-kube-api-access-nlt56" (OuterVolumeSpecName: "kube-api-access-nlt56") pod "9ffe19a0-667c-400f-b80f-3ddedcbec6dd" (UID: "9ffe19a0-667c-400f-b80f-3ddedcbec6dd"). InnerVolumeSpecName "kube-api-access-nlt56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.081295 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-inventory" (OuterVolumeSpecName: "inventory") pod "9ffe19a0-667c-400f-b80f-3ddedcbec6dd" (UID: "9ffe19a0-667c-400f-b80f-3ddedcbec6dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.083585 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ffe19a0-667c-400f-b80f-3ddedcbec6dd" (UID: "9ffe19a0-667c-400f-b80f-3ddedcbec6dd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.151950 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.151987 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlt56\" (UniqueName: \"kubernetes.io/projected/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-kube-api-access-nlt56\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.151998 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ffe19a0-667c-400f-b80f-3ddedcbec6dd-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.502398 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" 
event={"ID":"9ffe19a0-667c-400f-b80f-3ddedcbec6dd","Type":"ContainerDied","Data":"07863bf19fac130b450123035ae0924a3b24cb5430b84170dc70b64cc09ee2f3"} Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.502795 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07863bf19fac130b450123035ae0924a3b24cb5430b84170dc70b64cc09ee2f3" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.502887 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h8w5g" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.611666 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j"] Dec 12 00:59:33 crc kubenswrapper[4606]: E1212 00:59:33.612009 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bc90fc-6056-411c-943f-927e1329fe16" containerName="registry-server" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612025 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bc90fc-6056-411c-943f-927e1329fe16" containerName="registry-server" Dec 12 00:59:33 crc kubenswrapper[4606]: E1212 00:59:33.612039 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="extract-content" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612046 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="extract-content" Dec 12 00:59:33 crc kubenswrapper[4606]: E1212 00:59:33.612070 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bc90fc-6056-411c-943f-927e1329fe16" containerName="extract-content" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612077 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bc90fc-6056-411c-943f-927e1329fe16" containerName="extract-content" Dec 12 00:59:33 crc kubenswrapper[4606]: 
E1212 00:59:33.612088 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffe19a0-667c-400f-b80f-3ddedcbec6dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612096 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffe19a0-667c-400f-b80f-3ddedcbec6dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 12 00:59:33 crc kubenswrapper[4606]: E1212 00:59:33.612117 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bc90fc-6056-411c-943f-927e1329fe16" containerName="extract-utilities" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612123 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bc90fc-6056-411c-943f-927e1329fe16" containerName="extract-utilities" Dec 12 00:59:33 crc kubenswrapper[4606]: E1212 00:59:33.612133 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="extract-utilities" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612138 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="extract-utilities" Dec 12 00:59:33 crc kubenswrapper[4606]: E1212 00:59:33.612147 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="registry-server" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612153 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="registry-server" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612340 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5965cae1-310d-4026-bbee-f637e9677cd7" containerName="registry-server" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612359 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="17bc90fc-6056-411c-943f-927e1329fe16" 
containerName="registry-server" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612371 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffe19a0-667c-400f-b80f-3ddedcbec6dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.612919 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.615119 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.615712 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.622736 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.623018 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.644254 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j"] Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.764155 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.764313 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.764381 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw89q\" (UniqueName: \"kubernetes.io/projected/2cd82033-0e61-42e1-b532-65e0baa9d60e-kube-api-access-nw89q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.866407 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.866838 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.867012 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw89q\" (UniqueName: \"kubernetes.io/projected/2cd82033-0e61-42e1-b532-65e0baa9d60e-kube-api-access-nw89q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.871744 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.880565 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.898336 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw89q\" (UniqueName: \"kubernetes.io/projected/2cd82033-0e61-42e1-b532-65e0baa9d60e-kube-api-access-nw89q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:33 crc kubenswrapper[4606]: I1212 00:59:33.991042 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 00:59:34 crc kubenswrapper[4606]: I1212 00:59:34.736759 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 00:59:34 crc kubenswrapper[4606]: I1212 00:59:34.758420 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j"] Dec 12 00:59:35 crc kubenswrapper[4606]: I1212 00:59:35.532967 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" event={"ID":"2cd82033-0e61-42e1-b532-65e0baa9d60e","Type":"ContainerStarted","Data":"3577197a9b4601678b76afeaa21e372dd62f22774f2c9f4e17096c8288035a6d"} Dec 12 00:59:35 crc kubenswrapper[4606]: I1212 00:59:35.533411 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" event={"ID":"2cd82033-0e61-42e1-b532-65e0baa9d60e","Type":"ContainerStarted","Data":"7e31b951397ce2a2b23090b1f40a7dd2a98f0fe0525a163acf587edb19c48db7"} Dec 12 00:59:35 crc kubenswrapper[4606]: I1212 00:59:35.556890 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" podStartSLOduration=2.3606177280000002 podStartE2EDuration="2.556844305s" podCreationTimestamp="2025-12-12 00:59:33 +0000 UTC" firstStartedPulling="2025-12-12 00:59:34.736329903 +0000 UTC m=+2165.281682769" lastFinishedPulling="2025-12-12 00:59:34.93255648 +0000 UTC m=+2165.477909346" observedRunningTime="2025-12-12 00:59:35.556136777 +0000 UTC m=+2166.101489663" watchObservedRunningTime="2025-12-12 00:59:35.556844305 +0000 UTC m=+2166.102197201" Dec 12 00:59:43 crc kubenswrapper[4606]: I1212 00:59:43.700658 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 00:59:43 crc 
kubenswrapper[4606]: E1212 00:59:43.701409 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 00:59:57 crc kubenswrapper[4606]: I1212 00:59:57.699931 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 00:59:57 crc kubenswrapper[4606]: E1212 00:59:57.700665 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.146276 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt"] Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.148024 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.153454 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.154191 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.161083 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt"] Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.211858 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32af7f7b-1f31-4c55-9020-3401f0cbae70-secret-volume\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.211921 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqwt\" (UniqueName: \"kubernetes.io/projected/32af7f7b-1f31-4c55-9020-3401f0cbae70-kube-api-access-4nqwt\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.211955 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32af7f7b-1f31-4c55-9020-3401f0cbae70-config-volume\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.313287 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32af7f7b-1f31-4c55-9020-3401f0cbae70-secret-volume\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.313397 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqwt\" (UniqueName: \"kubernetes.io/projected/32af7f7b-1f31-4c55-9020-3401f0cbae70-kube-api-access-4nqwt\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.313463 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32af7f7b-1f31-4c55-9020-3401f0cbae70-config-volume\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.314540 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32af7f7b-1f31-4c55-9020-3401f0cbae70-config-volume\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.328111 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/32af7f7b-1f31-4c55-9020-3401f0cbae70-secret-volume\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.331053 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqwt\" (UniqueName: \"kubernetes.io/projected/32af7f7b-1f31-4c55-9020-3401f0cbae70-kube-api-access-4nqwt\") pod \"collect-profiles-29425020-mjtqt\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.469123 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:00 crc kubenswrapper[4606]: I1212 01:00:00.978368 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt"] Dec 12 01:00:01 crc kubenswrapper[4606]: I1212 01:00:01.801827 4606 generic.go:334] "Generic (PLEG): container finished" podID="32af7f7b-1f31-4c55-9020-3401f0cbae70" containerID="6ec8ffdf536a2a4e21ce41c6e47cafa65561abafba622430f6d3b046f27ccb30" exitCode=0 Dec 12 01:00:01 crc kubenswrapper[4606]: I1212 01:00:01.802020 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" event={"ID":"32af7f7b-1f31-4c55-9020-3401f0cbae70","Type":"ContainerDied","Data":"6ec8ffdf536a2a4e21ce41c6e47cafa65561abafba622430f6d3b046f27ccb30"} Dec 12 01:00:01 crc kubenswrapper[4606]: I1212 01:00:01.802349 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" 
event={"ID":"32af7f7b-1f31-4c55-9020-3401f0cbae70","Type":"ContainerStarted","Data":"30ce0034d21fa0263387721d0bff80d0f838b952963499c631318924ca1e8bf1"} Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.179698 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.211062 4606 scope.go:117] "RemoveContainer" containerID="9d565397aa178c1bed8048e5ced41a078fbb2034318963028a3445099f8688f5" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.272112 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nqwt\" (UniqueName: \"kubernetes.io/projected/32af7f7b-1f31-4c55-9020-3401f0cbae70-kube-api-access-4nqwt\") pod \"32af7f7b-1f31-4c55-9020-3401f0cbae70\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.272351 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32af7f7b-1f31-4c55-9020-3401f0cbae70-secret-volume\") pod \"32af7f7b-1f31-4c55-9020-3401f0cbae70\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.272500 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32af7f7b-1f31-4c55-9020-3401f0cbae70-config-volume\") pod \"32af7f7b-1f31-4c55-9020-3401f0cbae70\" (UID: \"32af7f7b-1f31-4c55-9020-3401f0cbae70\") " Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.276849 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32af7f7b-1f31-4c55-9020-3401f0cbae70-config-volume" (OuterVolumeSpecName: "config-volume") pod "32af7f7b-1f31-4c55-9020-3401f0cbae70" (UID: "32af7f7b-1f31-4c55-9020-3401f0cbae70"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.292818 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32af7f7b-1f31-4c55-9020-3401f0cbae70-kube-api-access-4nqwt" (OuterVolumeSpecName: "kube-api-access-4nqwt") pod "32af7f7b-1f31-4c55-9020-3401f0cbae70" (UID: "32af7f7b-1f31-4c55-9020-3401f0cbae70"). InnerVolumeSpecName "kube-api-access-4nqwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.293128 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32af7f7b-1f31-4c55-9020-3401f0cbae70-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "32af7f7b-1f31-4c55-9020-3401f0cbae70" (UID: "32af7f7b-1f31-4c55-9020-3401f0cbae70"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.376409 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nqwt\" (UniqueName: \"kubernetes.io/projected/32af7f7b-1f31-4c55-9020-3401f0cbae70-kube-api-access-4nqwt\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.376447 4606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32af7f7b-1f31-4c55-9020-3401f0cbae70-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.376460 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32af7f7b-1f31-4c55-9020-3401f0cbae70-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.820900 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" 
event={"ID":"32af7f7b-1f31-4c55-9020-3401f0cbae70","Type":"ContainerDied","Data":"30ce0034d21fa0263387721d0bff80d0f838b952963499c631318924ca1e8bf1"} Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.820949 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30ce0034d21fa0263387721d0bff80d0f838b952963499c631318924ca1e8bf1" Dec 12 01:00:03 crc kubenswrapper[4606]: I1212 01:00:03.820985 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt" Dec 12 01:00:04 crc kubenswrapper[4606]: I1212 01:00:04.273439 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"] Dec 12 01:00:04 crc kubenswrapper[4606]: I1212 01:00:04.281791 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424975-jpm67"] Dec 12 01:00:05 crc kubenswrapper[4606]: I1212 01:00:05.722723 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7c71bd-1fac-494e-8407-ecedfa667fc7" path="/var/lib/kubelet/pods/9b7c71bd-1fac-494e-8407-ecedfa667fc7/volumes" Dec 12 01:00:08 crc kubenswrapper[4606]: I1212 01:00:08.699828 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:00:08 crc kubenswrapper[4606]: E1212 01:00:08.700316 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:00:21 crc kubenswrapper[4606]: I1212 01:00:21.699651 4606 scope.go:117] "RemoveContainer" 
containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:00:21 crc kubenswrapper[4606]: E1212 01:00:21.700491 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:00:35 crc kubenswrapper[4606]: I1212 01:00:35.101799 4606 generic.go:334] "Generic (PLEG): container finished" podID="2cd82033-0e61-42e1-b532-65e0baa9d60e" containerID="3577197a9b4601678b76afeaa21e372dd62f22774f2c9f4e17096c8288035a6d" exitCode=0 Dec 12 01:00:35 crc kubenswrapper[4606]: I1212 01:00:35.101899 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" event={"ID":"2cd82033-0e61-42e1-b532-65e0baa9d60e","Type":"ContainerDied","Data":"3577197a9b4601678b76afeaa21e372dd62f22774f2c9f4e17096c8288035a6d"} Dec 12 01:00:35 crc kubenswrapper[4606]: I1212 01:00:35.699365 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:00:35 crc kubenswrapper[4606]: E1212 01:00:35.699705 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.539612 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.646023 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-inventory\") pod \"2cd82033-0e61-42e1-b532-65e0baa9d60e\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.646447 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-ssh-key\") pod \"2cd82033-0e61-42e1-b532-65e0baa9d60e\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.646671 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw89q\" (UniqueName: \"kubernetes.io/projected/2cd82033-0e61-42e1-b532-65e0baa9d60e-kube-api-access-nw89q\") pod \"2cd82033-0e61-42e1-b532-65e0baa9d60e\" (UID: \"2cd82033-0e61-42e1-b532-65e0baa9d60e\") " Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.663575 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd82033-0e61-42e1-b532-65e0baa9d60e-kube-api-access-nw89q" (OuterVolumeSpecName: "kube-api-access-nw89q") pod "2cd82033-0e61-42e1-b532-65e0baa9d60e" (UID: "2cd82033-0e61-42e1-b532-65e0baa9d60e"). InnerVolumeSpecName "kube-api-access-nw89q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.677023 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cd82033-0e61-42e1-b532-65e0baa9d60e" (UID: "2cd82033-0e61-42e1-b532-65e0baa9d60e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.682758 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-inventory" (OuterVolumeSpecName: "inventory") pod "2cd82033-0e61-42e1-b532-65e0baa9d60e" (UID: "2cd82033-0e61-42e1-b532-65e0baa9d60e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.749918 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw89q\" (UniqueName: \"kubernetes.io/projected/2cd82033-0e61-42e1-b532-65e0baa9d60e-kube-api-access-nw89q\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.749951 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:36 crc kubenswrapper[4606]: I1212 01:00:36.749966 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd82033-0e61-42e1-b532-65e0baa9d60e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.120681 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" event={"ID":"2cd82033-0e61-42e1-b532-65e0baa9d60e","Type":"ContainerDied","Data":"7e31b951397ce2a2b23090b1f40a7dd2a98f0fe0525a163acf587edb19c48db7"} Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.120730 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e31b951397ce2a2b23090b1f40a7dd2a98f0fe0525a163acf587edb19c48db7" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.120752 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.238995 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v76xq"] Dec 12 01:00:37 crc kubenswrapper[4606]: E1212 01:00:37.239525 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd82033-0e61-42e1-b532-65e0baa9d60e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.239548 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd82033-0e61-42e1-b532-65e0baa9d60e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:00:37 crc kubenswrapper[4606]: E1212 01:00:37.239569 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32af7f7b-1f31-4c55-9020-3401f0cbae70" containerName="collect-profiles" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.239579 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="32af7f7b-1f31-4c55-9020-3401f0cbae70" containerName="collect-profiles" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.239857 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd82033-0e61-42e1-b532-65e0baa9d60e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.239898 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="32af7f7b-1f31-4c55-9020-3401f0cbae70" containerName="collect-profiles" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.240677 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.246427 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.246640 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.246894 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.247115 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.254605 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v76xq"] Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.283652 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6vwk\" (UniqueName: \"kubernetes.io/projected/b40df3ae-e397-4c46-ae83-781aafd30e5e-kube-api-access-v6vwk\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.283726 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.283743 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.384621 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6vwk\" (UniqueName: \"kubernetes.io/projected/b40df3ae-e397-4c46-ae83-781aafd30e5e-kube-api-access-v6vwk\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.385003 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.385028 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.389892 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc 
kubenswrapper[4606]: I1212 01:00:37.392880 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.399964 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6vwk\" (UniqueName: \"kubernetes.io/projected/b40df3ae-e397-4c46-ae83-781aafd30e5e-kube-api-access-v6vwk\") pod \"ssh-known-hosts-edpm-deployment-v76xq\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.625380 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:37 crc kubenswrapper[4606]: I1212 01:00:37.968488 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v76xq"] Dec 12 01:00:38 crc kubenswrapper[4606]: I1212 01:00:38.138400 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" event={"ID":"b40df3ae-e397-4c46-ae83-781aafd30e5e","Type":"ContainerStarted","Data":"6d3e399e47fa4424d83afe025dcc39c5171fd3a6b8c3f0fa854115cb1ecf2ab8"} Dec 12 01:00:39 crc kubenswrapper[4606]: I1212 01:00:39.157534 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" event={"ID":"b40df3ae-e397-4c46-ae83-781aafd30e5e","Type":"ContainerStarted","Data":"c59c581e4fd2d9d80c142f94d0386254490142f6b7cfeefc29e3851dd9946750"} Dec 12 01:00:39 crc kubenswrapper[4606]: I1212 01:00:39.196333 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" podStartSLOduration=1.989124684 podStartE2EDuration="2.196292721s" podCreationTimestamp="2025-12-12 01:00:37 +0000 UTC" firstStartedPulling="2025-12-12 01:00:37.960360566 +0000 UTC m=+2228.505713442" lastFinishedPulling="2025-12-12 01:00:38.167528613 +0000 UTC m=+2228.712881479" observedRunningTime="2025-12-12 01:00:39.181076706 +0000 UTC m=+2229.726429612" watchObservedRunningTime="2025-12-12 01:00:39.196292721 +0000 UTC m=+2229.741645597" Dec 12 01:00:46 crc kubenswrapper[4606]: I1212 01:00:46.220908 4606 generic.go:334] "Generic (PLEG): container finished" podID="b40df3ae-e397-4c46-ae83-781aafd30e5e" containerID="c59c581e4fd2d9d80c142f94d0386254490142f6b7cfeefc29e3851dd9946750" exitCode=0 Dec 12 01:00:46 crc kubenswrapper[4606]: I1212 01:00:46.221005 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" event={"ID":"b40df3ae-e397-4c46-ae83-781aafd30e5e","Type":"ContainerDied","Data":"c59c581e4fd2d9d80c142f94d0386254490142f6b7cfeefc29e3851dd9946750"} Dec 12 01:00:46 crc kubenswrapper[4606]: I1212 01:00:46.699608 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:00:46 crc kubenswrapper[4606]: E1212 01:00:46.700105 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.700551 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.807869 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6vwk\" (UniqueName: \"kubernetes.io/projected/b40df3ae-e397-4c46-ae83-781aafd30e5e-kube-api-access-v6vwk\") pod \"b40df3ae-e397-4c46-ae83-781aafd30e5e\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.807954 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-inventory-0\") pod \"b40df3ae-e397-4c46-ae83-781aafd30e5e\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.808095 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-ssh-key-openstack-edpm-ipam\") pod \"b40df3ae-e397-4c46-ae83-781aafd30e5e\" (UID: \"b40df3ae-e397-4c46-ae83-781aafd30e5e\") " Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.820482 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40df3ae-e397-4c46-ae83-781aafd30e5e-kube-api-access-v6vwk" (OuterVolumeSpecName: "kube-api-access-v6vwk") pod "b40df3ae-e397-4c46-ae83-781aafd30e5e" (UID: "b40df3ae-e397-4c46-ae83-781aafd30e5e"). InnerVolumeSpecName "kube-api-access-v6vwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.837667 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b40df3ae-e397-4c46-ae83-781aafd30e5e" (UID: "b40df3ae-e397-4c46-ae83-781aafd30e5e"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.838315 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b40df3ae-e397-4c46-ae83-781aafd30e5e" (UID: "b40df3ae-e397-4c46-ae83-781aafd30e5e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.910899 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6vwk\" (UniqueName: \"kubernetes.io/projected/b40df3ae-e397-4c46-ae83-781aafd30e5e-kube-api-access-v6vwk\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.910947 4606 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:47 crc kubenswrapper[4606]: I1212 01:00:47.910964 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b40df3ae-e397-4c46-ae83-781aafd30e5e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.249108 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" event={"ID":"b40df3ae-e397-4c46-ae83-781aafd30e5e","Type":"ContainerDied","Data":"6d3e399e47fa4424d83afe025dcc39c5171fd3a6b8c3f0fa854115cb1ecf2ab8"} Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.249145 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3e399e47fa4424d83afe025dcc39c5171fd3a6b8c3f0fa854115cb1ecf2ab8" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.249351 
4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v76xq" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.496759 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w"] Dec 12 01:00:48 crc kubenswrapper[4606]: E1212 01:00:48.497327 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40df3ae-e397-4c46-ae83-781aafd30e5e" containerName="ssh-known-hosts-edpm-deployment" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.497352 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40df3ae-e397-4c46-ae83-781aafd30e5e" containerName="ssh-known-hosts-edpm-deployment" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.497617 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40df3ae-e397-4c46-ae83-781aafd30e5e" containerName="ssh-known-hosts-edpm-deployment" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.498455 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.505424 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.505684 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.505785 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w"] Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.506044 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.506229 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.524668 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.524771 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2mz\" (UniqueName: \"kubernetes.io/projected/dbfee972-c3b5-489c-adb3-c7e6720b67d0-kube-api-access-sv2mz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.524827 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.625966 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.626410 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2mz\" (UniqueName: \"kubernetes.io/projected/dbfee972-c3b5-489c-adb3-c7e6720b67d0-kube-api-access-sv2mz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.626546 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.631208 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.631474 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.644031 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2mz\" (UniqueName: \"kubernetes.io/projected/dbfee972-c3b5-489c-adb3-c7e6720b67d0-kube-api-access-sv2mz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fzq9w\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:48 crc kubenswrapper[4606]: I1212 01:00:48.837873 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:00:49 crc kubenswrapper[4606]: I1212 01:00:49.438918 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w"] Dec 12 01:00:50 crc kubenswrapper[4606]: I1212 01:00:50.268959 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" event={"ID":"dbfee972-c3b5-489c-adb3-c7e6720b67d0","Type":"ContainerStarted","Data":"1b1aac30dbf046f5aac86ddd9ab1587eabba6ed07fc0c41fd61122eedd6058d4"} Dec 12 01:00:50 crc kubenswrapper[4606]: I1212 01:00:50.269334 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" event={"ID":"dbfee972-c3b5-489c-adb3-c7e6720b67d0","Type":"ContainerStarted","Data":"eebb7af5834af22d2c390453363e7e2e1d54bae6b45e0739967c0166b7ad684d"} Dec 12 01:00:50 crc kubenswrapper[4606]: I1212 01:00:50.288729 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" podStartSLOduration=2.039252575 podStartE2EDuration="2.288708636s" podCreationTimestamp="2025-12-12 01:00:48 +0000 UTC" firstStartedPulling="2025-12-12 01:00:49.455482526 +0000 UTC m=+2240.000835392" lastFinishedPulling="2025-12-12 01:00:49.704938587 +0000 UTC m=+2240.250291453" observedRunningTime="2025-12-12 01:00:50.282249124 +0000 UTC m=+2240.827602000" watchObservedRunningTime="2025-12-12 01:00:50.288708636 +0000 UTC m=+2240.834061502" Dec 12 01:00:59 crc kubenswrapper[4606]: I1212 01:00:59.343412 4606 generic.go:334] "Generic (PLEG): container finished" podID="dbfee972-c3b5-489c-adb3-c7e6720b67d0" containerID="1b1aac30dbf046f5aac86ddd9ab1587eabba6ed07fc0c41fd61122eedd6058d4" exitCode=0 Dec 12 01:00:59 crc kubenswrapper[4606]: I1212 01:00:59.343473 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" event={"ID":"dbfee972-c3b5-489c-adb3-c7e6720b67d0","Type":"ContainerDied","Data":"1b1aac30dbf046f5aac86ddd9ab1587eabba6ed07fc0c41fd61122eedd6058d4"} Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.128381 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29425021-wxzqx"] Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.132121 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.141844 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29425021-wxzqx"] Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.284509 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-config-data\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.284576 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdnl\" (UniqueName: \"kubernetes.io/projected/dc2e18bf-398b-4f87-90c3-a14838b991d6-kube-api-access-mrdnl\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.284614 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-fernet-keys\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 
01:01:00.284713 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-combined-ca-bundle\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.386667 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-config-data\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.386744 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdnl\" (UniqueName: \"kubernetes.io/projected/dc2e18bf-398b-4f87-90c3-a14838b991d6-kube-api-access-mrdnl\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.386789 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-fernet-keys\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.386882 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-combined-ca-bundle\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.393514 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-combined-ca-bundle\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.394695 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-config-data\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.398138 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-fernet-keys\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.424087 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdnl\" (UniqueName: \"kubernetes.io/projected/dc2e18bf-398b-4f87-90c3-a14838b991d6-kube-api-access-mrdnl\") pod \"keystone-cron-29425021-wxzqx\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.450800 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.700286 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:01:00 crc kubenswrapper[4606]: E1212 01:01:00.701015 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.791674 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.907331 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-ssh-key\") pod \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.907627 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-inventory\") pod \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.907661 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv2mz\" (UniqueName: \"kubernetes.io/projected/dbfee972-c3b5-489c-adb3-c7e6720b67d0-kube-api-access-sv2mz\") pod \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\" (UID: \"dbfee972-c3b5-489c-adb3-c7e6720b67d0\") " 
Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.912477 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfee972-c3b5-489c-adb3-c7e6720b67d0-kube-api-access-sv2mz" (OuterVolumeSpecName: "kube-api-access-sv2mz") pod "dbfee972-c3b5-489c-adb3-c7e6720b67d0" (UID: "dbfee972-c3b5-489c-adb3-c7e6720b67d0"). InnerVolumeSpecName "kube-api-access-sv2mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.940306 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dbfee972-c3b5-489c-adb3-c7e6720b67d0" (UID: "dbfee972-c3b5-489c-adb3-c7e6720b67d0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.946635 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-inventory" (OuterVolumeSpecName: "inventory") pod "dbfee972-c3b5-489c-adb3-c7e6720b67d0" (UID: "dbfee972-c3b5-489c-adb3-c7e6720b67d0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:01:00 crc kubenswrapper[4606]: I1212 01:01:00.960420 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29425021-wxzqx"] Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.009839 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.009872 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbfee972-c3b5-489c-adb3-c7e6720b67d0-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.009883 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv2mz\" (UniqueName: \"kubernetes.io/projected/dbfee972-c3b5-489c-adb3-c7e6720b67d0-kube-api-access-sv2mz\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.359749 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" event={"ID":"dbfee972-c3b5-489c-adb3-c7e6720b67d0","Type":"ContainerDied","Data":"eebb7af5834af22d2c390453363e7e2e1d54bae6b45e0739967c0166b7ad684d"} Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.359794 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eebb7af5834af22d2c390453363e7e2e1d54bae6b45e0739967c0166b7ad684d" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.359851 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fzq9w" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.367555 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29425021-wxzqx" event={"ID":"dc2e18bf-398b-4f87-90c3-a14838b991d6","Type":"ContainerStarted","Data":"f7797c6ec7dff46d42f1f137438097c725bdf4805bf53cc858d487a42b4f9b3e"} Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.367597 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29425021-wxzqx" event={"ID":"dc2e18bf-398b-4f87-90c3-a14838b991d6","Type":"ContainerStarted","Data":"152460977ddde1eb4ada37cf49e5c20356a0fea7a446939b35521debc5e898ff"} Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.410516 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29425021-wxzqx" podStartSLOduration=1.410488911 podStartE2EDuration="1.410488911s" podCreationTimestamp="2025-12-12 01:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 01:01:01.403043763 +0000 UTC m=+2251.948396629" watchObservedRunningTime="2025-12-12 01:01:01.410488911 +0000 UTC m=+2251.955841777" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.454065 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn"] Dec 12 01:01:01 crc kubenswrapper[4606]: E1212 01:01:01.454775 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfee972-c3b5-489c-adb3-c7e6720b67d0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.454797 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfee972-c3b5-489c-adb3-c7e6720b67d0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.461091 4606 
memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfee972-c3b5-489c-adb3-c7e6720b67d0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.461802 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.464201 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.464442 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.464642 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.466504 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.476243 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn"] Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.621152 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.621239 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-ssh-key\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.621478 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2b95\" (UniqueName: \"kubernetes.io/projected/979cac18-8b58-417f-907c-d0aa1cc7646a-kube-api-access-t2b95\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.722543 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.722829 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.723009 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2b95\" (UniqueName: \"kubernetes.io/projected/979cac18-8b58-417f-907c-d0aa1cc7646a-kube-api-access-t2b95\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.727761 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.738767 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.744384 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2b95\" (UniqueName: \"kubernetes.io/projected/979cac18-8b58-417f-907c-d0aa1cc7646a-kube-api-access-t2b95\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:01 crc kubenswrapper[4606]: I1212 01:01:01.793441 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:02 crc kubenswrapper[4606]: I1212 01:01:02.396323 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn"] Dec 12 01:01:02 crc kubenswrapper[4606]: W1212 01:01:02.407242 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979cac18_8b58_417f_907c_d0aa1cc7646a.slice/crio-65d3801515deb9f467d60df5b34397bf02dcd0f80dfd5e1a4681a2333435797c WatchSource:0}: Error finding container 65d3801515deb9f467d60df5b34397bf02dcd0f80dfd5e1a4681a2333435797c: Status 404 returned error can't find the container with id 65d3801515deb9f467d60df5b34397bf02dcd0f80dfd5e1a4681a2333435797c Dec 12 01:01:03 crc kubenswrapper[4606]: I1212 01:01:03.386371 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" event={"ID":"979cac18-8b58-417f-907c-d0aa1cc7646a","Type":"ContainerStarted","Data":"80d89dc107a5d6bbbcf368414e68c75d0f7453cab5778c12fecf6e01770b6fea"} Dec 12 01:01:03 crc kubenswrapper[4606]: I1212 01:01:03.386770 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" event={"ID":"979cac18-8b58-417f-907c-d0aa1cc7646a","Type":"ContainerStarted","Data":"65d3801515deb9f467d60df5b34397bf02dcd0f80dfd5e1a4681a2333435797c"} Dec 12 01:01:03 crc kubenswrapper[4606]: I1212 01:01:03.430541 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" podStartSLOduration=2.263987044 podStartE2EDuration="2.430521561s" podCreationTimestamp="2025-12-12 01:01:01 +0000 UTC" firstStartedPulling="2025-12-12 01:01:02.410744441 +0000 UTC m=+2252.956097307" lastFinishedPulling="2025-12-12 01:01:02.577278958 +0000 UTC m=+2253.122631824" 
observedRunningTime="2025-12-12 01:01:03.425670282 +0000 UTC m=+2253.971023148" watchObservedRunningTime="2025-12-12 01:01:03.430521561 +0000 UTC m=+2253.975874427" Dec 12 01:01:03 crc kubenswrapper[4606]: I1212 01:01:03.452297 4606 scope.go:117] "RemoveContainer" containerID="f3acce94c709bfd6e56ca940f2b563c56c62902741e670a2d5dd229d699dbb46" Dec 12 01:01:04 crc kubenswrapper[4606]: I1212 01:01:04.399865 4606 generic.go:334] "Generic (PLEG): container finished" podID="dc2e18bf-398b-4f87-90c3-a14838b991d6" containerID="f7797c6ec7dff46d42f1f137438097c725bdf4805bf53cc858d487a42b4f9b3e" exitCode=0 Dec 12 01:01:04 crc kubenswrapper[4606]: I1212 01:01:04.401010 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29425021-wxzqx" event={"ID":"dc2e18bf-398b-4f87-90c3-a14838b991d6","Type":"ContainerDied","Data":"f7797c6ec7dff46d42f1f137438097c725bdf4805bf53cc858d487a42b4f9b3e"} Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.785125 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.931244 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-fernet-keys\") pod \"dc2e18bf-398b-4f87-90c3-a14838b991d6\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.931321 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdnl\" (UniqueName: \"kubernetes.io/projected/dc2e18bf-398b-4f87-90c3-a14838b991d6-kube-api-access-mrdnl\") pod \"dc2e18bf-398b-4f87-90c3-a14838b991d6\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.931349 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-config-data\") pod \"dc2e18bf-398b-4f87-90c3-a14838b991d6\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.931391 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-combined-ca-bundle\") pod \"dc2e18bf-398b-4f87-90c3-a14838b991d6\" (UID: \"dc2e18bf-398b-4f87-90c3-a14838b991d6\") " Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.942623 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2e18bf-398b-4f87-90c3-a14838b991d6-kube-api-access-mrdnl" (OuterVolumeSpecName: "kube-api-access-mrdnl") pod "dc2e18bf-398b-4f87-90c3-a14838b991d6" (UID: "dc2e18bf-398b-4f87-90c3-a14838b991d6"). InnerVolumeSpecName "kube-api-access-mrdnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.951157 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dc2e18bf-398b-4f87-90c3-a14838b991d6" (UID: "dc2e18bf-398b-4f87-90c3-a14838b991d6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:01:05 crc kubenswrapper[4606]: I1212 01:01:05.976427 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc2e18bf-398b-4f87-90c3-a14838b991d6" (UID: "dc2e18bf-398b-4f87-90c3-a14838b991d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.001307 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-config-data" (OuterVolumeSpecName: "config-data") pod "dc2e18bf-398b-4f87-90c3-a14838b991d6" (UID: "dc2e18bf-398b-4f87-90c3-a14838b991d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.033820 4606 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.034498 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrdnl\" (UniqueName: \"kubernetes.io/projected/dc2e18bf-398b-4f87-90c3-a14838b991d6-kube-api-access-mrdnl\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.034521 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.034531 4606 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e18bf-398b-4f87-90c3-a14838b991d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.419054 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29425021-wxzqx" event={"ID":"dc2e18bf-398b-4f87-90c3-a14838b991d6","Type":"ContainerDied","Data":"152460977ddde1eb4ada37cf49e5c20356a0fea7a446939b35521debc5e898ff"} Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.419105 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152460977ddde1eb4ada37cf49e5c20356a0fea7a446939b35521debc5e898ff" Dec 12 01:01:06 crc kubenswrapper[4606]: I1212 01:01:06.419213 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29425021-wxzqx" Dec 12 01:01:11 crc kubenswrapper[4606]: I1212 01:01:11.699525 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:01:11 crc kubenswrapper[4606]: E1212 01:01:11.700475 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:01:13 crc kubenswrapper[4606]: I1212 01:01:13.482787 4606 generic.go:334] "Generic (PLEG): container finished" podID="979cac18-8b58-417f-907c-d0aa1cc7646a" containerID="80d89dc107a5d6bbbcf368414e68c75d0f7453cab5778c12fecf6e01770b6fea" exitCode=0 Dec 12 01:01:13 crc kubenswrapper[4606]: I1212 01:01:13.482829 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" event={"ID":"979cac18-8b58-417f-907c-d0aa1cc7646a","Type":"ContainerDied","Data":"80d89dc107a5d6bbbcf368414e68c75d0f7453cab5778c12fecf6e01770b6fea"} Dec 12 01:01:14 crc kubenswrapper[4606]: I1212 01:01:14.886543 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.011393 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-inventory\") pod \"979cac18-8b58-417f-907c-d0aa1cc7646a\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.011744 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-ssh-key\") pod \"979cac18-8b58-417f-907c-d0aa1cc7646a\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.011905 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2b95\" (UniqueName: \"kubernetes.io/projected/979cac18-8b58-417f-907c-d0aa1cc7646a-kube-api-access-t2b95\") pod \"979cac18-8b58-417f-907c-d0aa1cc7646a\" (UID: \"979cac18-8b58-417f-907c-d0aa1cc7646a\") " Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.020451 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979cac18-8b58-417f-907c-d0aa1cc7646a-kube-api-access-t2b95" (OuterVolumeSpecName: "kube-api-access-t2b95") pod "979cac18-8b58-417f-907c-d0aa1cc7646a" (UID: "979cac18-8b58-417f-907c-d0aa1cc7646a"). InnerVolumeSpecName "kube-api-access-t2b95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.043570 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-inventory" (OuterVolumeSpecName: "inventory") pod "979cac18-8b58-417f-907c-d0aa1cc7646a" (UID: "979cac18-8b58-417f-907c-d0aa1cc7646a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.044713 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "979cac18-8b58-417f-907c-d0aa1cc7646a" (UID: "979cac18-8b58-417f-907c-d0aa1cc7646a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.114360 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2b95\" (UniqueName: \"kubernetes.io/projected/979cac18-8b58-417f-907c-d0aa1cc7646a-kube-api-access-t2b95\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.114417 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.114426 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979cac18-8b58-417f-907c-d0aa1cc7646a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.510474 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" event={"ID":"979cac18-8b58-417f-907c-d0aa1cc7646a","Type":"ContainerDied","Data":"65d3801515deb9f467d60df5b34397bf02dcd0f80dfd5e1a4681a2333435797c"} Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.510703 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65d3801515deb9f467d60df5b34397bf02dcd0f80dfd5e1a4681a2333435797c" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.510538 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.625983 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh"] Dec 12 01:01:15 crc kubenswrapper[4606]: E1212 01:01:15.626447 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979cac18-8b58-417f-907c-d0aa1cc7646a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.626469 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="979cac18-8b58-417f-907c-d0aa1cc7646a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:01:15 crc kubenswrapper[4606]: E1212 01:01:15.626508 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2e18bf-398b-4f87-90c3-a14838b991d6" containerName="keystone-cron" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.626517 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2e18bf-398b-4f87-90c3-a14838b991d6" containerName="keystone-cron" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.626747 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2e18bf-398b-4f87-90c3-a14838b991d6" containerName="keystone-cron" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.626800 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="979cac18-8b58-417f-907c-d0aa1cc7646a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.627893 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.631586 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.632008 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.632285 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.632477 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.632624 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.632821 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.632918 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.647448 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh"] Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.650224 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.737737 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.737796 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.737859 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.737947 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb9qk\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-kube-api-access-jb9qk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738024 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738083 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738099 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738120 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738471 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738496 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738571 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738621 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738737 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.738757 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.840834 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb9qk\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-kube-api-access-jb9qk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.840900 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.840955 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.840973 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.840995 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841026 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841044 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" 
(UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841080 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841100 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841118 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841135 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841160 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841194 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.841235 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.845907 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc 
kubenswrapper[4606]: I1212 01:01:15.848006 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.848659 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.849534 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.850974 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.851203 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-libvirt-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.852113 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.852653 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.852775 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.854632 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.854978 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.855252 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.855947 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.864606 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb9qk\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-kube-api-access-jb9qk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9kplh\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:15 crc kubenswrapper[4606]: I1212 01:01:15.946974 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:01:16 crc kubenswrapper[4606]: I1212 01:01:16.479812 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh"] Dec 12 01:01:16 crc kubenswrapper[4606]: I1212 01:01:16.521096 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" event={"ID":"134d45c2-0084-4779-9125-b36e673a5cf8","Type":"ContainerStarted","Data":"79de0baeea8c590a3f093080ff04f6066ec8d2ecd711f7f5b33815b2b06c42b3"} Dec 12 01:01:17 crc kubenswrapper[4606]: I1212 01:01:17.536041 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" event={"ID":"134d45c2-0084-4779-9125-b36e673a5cf8","Type":"ContainerStarted","Data":"5d3e41272f3bb7f3bda3bcfa185943ef54e4526616cb077010d58829c9f5bf46"} Dec 12 01:01:17 crc kubenswrapper[4606]: I1212 01:01:17.579162 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" podStartSLOduration=2.443106664 podStartE2EDuration="2.57914124s" podCreationTimestamp="2025-12-12 01:01:15 +0000 UTC" firstStartedPulling="2025-12-12 01:01:16.487228203 +0000 UTC m=+2267.032581069" lastFinishedPulling="2025-12-12 01:01:16.623262769 +0000 UTC m=+2267.168615645" observedRunningTime="2025-12-12 01:01:17.569243627 +0000 UTC m=+2268.114596503" watchObservedRunningTime="2025-12-12 01:01:17.57914124 +0000 UTC m=+2268.124494106" Dec 12 01:01:22 crc kubenswrapper[4606]: I1212 01:01:22.716872 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 
01:01:22 crc kubenswrapper[4606]: E1212 01:01:22.717660 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:01:34 crc kubenswrapper[4606]: I1212 01:01:34.700273 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:01:34 crc kubenswrapper[4606]: E1212 01:01:34.701157 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:01:46 crc kubenswrapper[4606]: I1212 01:01:46.700359 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:01:46 crc kubenswrapper[4606]: E1212 01:01:46.701058 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:01:59 crc kubenswrapper[4606]: I1212 01:01:59.707377 4606 scope.go:117] "RemoveContainer" 
containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:01:59 crc kubenswrapper[4606]: E1212 01:01:59.708235 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:02:00 crc kubenswrapper[4606]: I1212 01:02:00.945757 4606 generic.go:334] "Generic (PLEG): container finished" podID="134d45c2-0084-4779-9125-b36e673a5cf8" containerID="5d3e41272f3bb7f3bda3bcfa185943ef54e4526616cb077010d58829c9f5bf46" exitCode=0 Dec 12 01:02:00 crc kubenswrapper[4606]: I1212 01:02:00.945793 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" event={"ID":"134d45c2-0084-4779-9125-b36e673a5cf8","Type":"ContainerDied","Data":"5d3e41272f3bb7f3bda3bcfa185943ef54e4526616cb077010d58829c9f5bf46"} Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.388031 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.524617 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.524919 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525014 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ssh-key\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525084 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ovn-combined-ca-bundle\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525193 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-neutron-metadata-combined-ca-bundle\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: 
\"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525305 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-bootstrap-combined-ca-bundle\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525389 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-libvirt-combined-ca-bundle\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525461 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-repo-setup-combined-ca-bundle\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525567 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb9qk\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-kube-api-access-jb9qk\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525668 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc 
kubenswrapper[4606]: I1212 01:02:02.525765 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-inventory\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525843 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-nova-combined-ca-bundle\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.525922 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-telemetry-combined-ca-bundle\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.526005 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"134d45c2-0084-4779-9125-b36e673a5cf8\" (UID: \"134d45c2-0084-4779-9125-b36e673a5cf8\") " Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.546885 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.547553 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.547723 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.550133 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.562348 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.562669 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.569455 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.570443 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.570632 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.570710 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.580307 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.581364 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-kube-api-access-jb9qk" (OuterVolumeSpecName: "kube-api-access-jb9qk") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "kube-api-access-jb9qk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.622630 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-inventory" (OuterVolumeSpecName: "inventory") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.630624 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb9qk\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-kube-api-access-jb9qk\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.630745 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.630804 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.630866 4606 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.630933 4606 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.630993 4606 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.631057 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.631141 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/134d45c2-0084-4779-9125-b36e673a5cf8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.631229 4606 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.631293 4606 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.631350 4606 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.631403 4606 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.631457 4606 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.633671 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "134d45c2-0084-4779-9125-b36e673a5cf8" (UID: "134d45c2-0084-4779-9125-b36e673a5cf8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.733581 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/134d45c2-0084-4779-9125-b36e673a5cf8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.966516 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" event={"ID":"134d45c2-0084-4779-9125-b36e673a5cf8","Type":"ContainerDied","Data":"79de0baeea8c590a3f093080ff04f6066ec8d2ecd711f7f5b33815b2b06c42b3"} Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.966569 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79de0baeea8c590a3f093080ff04f6066ec8d2ecd711f7f5b33815b2b06c42b3" Dec 12 01:02:02 crc kubenswrapper[4606]: I1212 01:02:02.966857 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9kplh" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.114303 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz"] Dec 12 01:02:03 crc kubenswrapper[4606]: E1212 01:02:03.114788 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134d45c2-0084-4779-9125-b36e673a5cf8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.114819 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="134d45c2-0084-4779-9125-b36e673a5cf8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.115067 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="134d45c2-0084-4779-9125-b36e673a5cf8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.116914 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.118560 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.119621 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.119639 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.121465 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.121644 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.143604 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz"] Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.257887 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdkm9\" (UniqueName: \"kubernetes.io/projected/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-kube-api-access-jdkm9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.258349 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: 
\"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.258397 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.258440 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.258473 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.360021 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.360077 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdkm9\" 
(UniqueName: \"kubernetes.io/projected/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-kube-api-access-jdkm9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.360204 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.360273 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.360308 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.361112 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc 
kubenswrapper[4606]: I1212 01:02:03.368068 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.368287 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.370390 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.383077 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdkm9\" (UniqueName: \"kubernetes.io/projected/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-kube-api-access-jdkm9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zggvz\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:03 crc kubenswrapper[4606]: I1212 01:02:03.460541 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:02:04 crc kubenswrapper[4606]: I1212 01:02:04.042630 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz"] Dec 12 01:02:04 crc kubenswrapper[4606]: W1212 01:02:04.042986 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b9161f7_ad7e_44ec_8fc1_cdc18e2c9d56.slice/crio-9a45c64c8ba053ac7bc52ea63a0d8be440f9ee252c1073b24ef943604d5a9804 WatchSource:0}: Error finding container 9a45c64c8ba053ac7bc52ea63a0d8be440f9ee252c1073b24ef943604d5a9804: Status 404 returned error can't find the container with id 9a45c64c8ba053ac7bc52ea63a0d8be440f9ee252c1073b24ef943604d5a9804 Dec 12 01:02:04 crc kubenswrapper[4606]: I1212 01:02:04.986795 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" event={"ID":"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56","Type":"ContainerStarted","Data":"b104fd17029c46d32e6560c8caf7d7d762d46ee13a817c3b77ce7e9c1c248bac"} Dec 12 01:02:04 crc kubenswrapper[4606]: I1212 01:02:04.987062 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" event={"ID":"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56","Type":"ContainerStarted","Data":"9a45c64c8ba053ac7bc52ea63a0d8be440f9ee252c1073b24ef943604d5a9804"} Dec 12 01:02:05 crc kubenswrapper[4606]: I1212 01:02:05.026922 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" podStartSLOduration=1.884256994 podStartE2EDuration="2.026872716s" podCreationTimestamp="2025-12-12 01:02:03 +0000 UTC" firstStartedPulling="2025-12-12 01:02:04.046817377 +0000 UTC m=+2314.592170253" lastFinishedPulling="2025-12-12 01:02:04.189433099 +0000 UTC m=+2314.734785975" observedRunningTime="2025-12-12 
01:02:05.005652782 +0000 UTC m=+2315.551005668" watchObservedRunningTime="2025-12-12 01:02:05.026872716 +0000 UTC m=+2315.572225622" Dec 12 01:02:14 crc kubenswrapper[4606]: I1212 01:02:14.700071 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:02:14 crc kubenswrapper[4606]: E1212 01:02:14.701095 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:02:25 crc kubenswrapper[4606]: I1212 01:02:25.699903 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:02:25 crc kubenswrapper[4606]: E1212 01:02:25.700666 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:02:39 crc kubenswrapper[4606]: I1212 01:02:39.708900 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:02:39 crc kubenswrapper[4606]: E1212 01:02:39.712003 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:02:53 crc kubenswrapper[4606]: I1212 01:02:53.700972 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:02:53 crc kubenswrapper[4606]: E1212 01:02:53.701830 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:03:05 crc kubenswrapper[4606]: I1212 01:03:05.699664 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:03:05 crc kubenswrapper[4606]: E1212 01:03:05.700628 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:03:16 crc kubenswrapper[4606]: I1212 01:03:16.686997 4606 generic.go:334] "Generic (PLEG): container finished" podID="8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" containerID="b104fd17029c46d32e6560c8caf7d7d762d46ee13a817c3b77ce7e9c1c248bac" exitCode=0 Dec 12 01:03:16 crc kubenswrapper[4606]: I1212 01:03:16.687445 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" event={"ID":"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56","Type":"ContainerDied","Data":"b104fd17029c46d32e6560c8caf7d7d762d46ee13a817c3b77ce7e9c1c248bac"} Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.109878 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.250765 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ssh-key\") pod \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.250909 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovncontroller-config-0\") pod \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.251049 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdkm9\" (UniqueName: \"kubernetes.io/projected/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-kube-api-access-jdkm9\") pod \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.251142 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovn-combined-ca-bundle\") pod \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.251246 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-inventory\") pod \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\" (UID: \"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56\") " Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.255945 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" (UID: "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.260799 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-kube-api-access-jdkm9" (OuterVolumeSpecName: "kube-api-access-jdkm9") pod "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" (UID: "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56"). InnerVolumeSpecName "kube-api-access-jdkm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.279745 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-inventory" (OuterVolumeSpecName: "inventory") pod "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" (UID: "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.281591 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" (UID: "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.301010 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" (UID: "8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.354259 4606 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.354302 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.354314 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.354328 4606 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.354340 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdkm9\" (UniqueName: \"kubernetes.io/projected/8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56-kube-api-access-jdkm9\") on node \"crc\" DevicePath \"\"" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.703904 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" event={"ID":"8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56","Type":"ContainerDied","Data":"9a45c64c8ba053ac7bc52ea63a0d8be440f9ee252c1073b24ef943604d5a9804"} Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.704222 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a45c64c8ba053ac7bc52ea63a0d8be440f9ee252c1073b24ef943604d5a9804" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.703934 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zggvz" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.921049 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d"] Dec 12 01:03:18 crc kubenswrapper[4606]: E1212 01:03:18.921455 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.921467 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.921685 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.922330 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.925303 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.925446 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.925545 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.925760 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.925926 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.926280 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:03:18 crc kubenswrapper[4606]: I1212 01:03:18.938312 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d"] Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.069088 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.069135 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.069218 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.069240 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.069256 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.069283 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kcxpr\" (UniqueName: \"kubernetes.io/projected/9dafb619-e866-4f79-8e75-28c88fe1dfe7-kube-api-access-kcxpr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.170616 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.170671 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.170701 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.170740 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxpr\" (UniqueName: 
\"kubernetes.io/projected/9dafb619-e866-4f79-8e75-28c88fe1dfe7-kube-api-access-kcxpr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.170889 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.170914 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.177026 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.177055 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.183559 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.184383 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.194282 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.196245 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxpr\" (UniqueName: \"kubernetes.io/projected/9dafb619-e866-4f79-8e75-28c88fe1dfe7-kube-api-access-kcxpr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc 
kubenswrapper[4606]: I1212 01:03:19.254904 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:03:19 crc kubenswrapper[4606]: I1212 01:03:19.828271 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d"] Dec 12 01:03:20 crc kubenswrapper[4606]: I1212 01:03:20.700016 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:03:20 crc kubenswrapper[4606]: E1212 01:03:20.700462 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:03:20 crc kubenswrapper[4606]: I1212 01:03:20.722673 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" event={"ID":"9dafb619-e866-4f79-8e75-28c88fe1dfe7","Type":"ContainerStarted","Data":"330bc122ebbde9973a511008a71ceb774546bc7962841e8d1e9c03136ddec3ac"} Dec 12 01:03:20 crc kubenswrapper[4606]: I1212 01:03:20.722712 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" event={"ID":"9dafb619-e866-4f79-8e75-28c88fe1dfe7","Type":"ContainerStarted","Data":"dd45a8b01f5b15a21980b7227ea5af9d825f6bcefe865eb82b824b2a4379b581"} Dec 12 01:03:20 crc kubenswrapper[4606]: I1212 01:03:20.746083 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" podStartSLOduration=2.558509069 
podStartE2EDuration="2.746050564s" podCreationTimestamp="2025-12-12 01:03:18 +0000 UTC" firstStartedPulling="2025-12-12 01:03:19.825677303 +0000 UTC m=+2390.371030169" lastFinishedPulling="2025-12-12 01:03:20.013218778 +0000 UTC m=+2390.558571664" observedRunningTime="2025-12-12 01:03:20.742694875 +0000 UTC m=+2391.288047751" watchObservedRunningTime="2025-12-12 01:03:20.746050564 +0000 UTC m=+2391.291403430" Dec 12 01:03:32 crc kubenswrapper[4606]: I1212 01:03:32.700941 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:03:32 crc kubenswrapper[4606]: E1212 01:03:32.702566 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:03:47 crc kubenswrapper[4606]: I1212 01:03:47.700138 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:03:47 crc kubenswrapper[4606]: E1212 01:03:47.701417 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:03:59 crc kubenswrapper[4606]: I1212 01:03:59.709018 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:03:59 crc kubenswrapper[4606]: E1212 01:03:59.709818 
4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:04:14 crc kubenswrapper[4606]: I1212 01:04:14.226628 4606 generic.go:334] "Generic (PLEG): container finished" podID="9dafb619-e866-4f79-8e75-28c88fe1dfe7" containerID="330bc122ebbde9973a511008a71ceb774546bc7962841e8d1e9c03136ddec3ac" exitCode=0 Dec 12 01:04:14 crc kubenswrapper[4606]: I1212 01:04:14.226709 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" event={"ID":"9dafb619-e866-4f79-8e75-28c88fe1dfe7","Type":"ContainerDied","Data":"330bc122ebbde9973a511008a71ceb774546bc7962841e8d1e9c03136ddec3ac"} Dec 12 01:04:14 crc kubenswrapper[4606]: I1212 01:04:14.700282 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:04:14 crc kubenswrapper[4606]: E1212 01:04:14.700825 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.714290 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.785924 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-inventory\") pod \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.785981 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-nova-metadata-neutron-config-0\") pod \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.786027 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcxpr\" (UniqueName: \"kubernetes.io/projected/9dafb619-e866-4f79-8e75-28c88fe1dfe7-kube-api-access-kcxpr\") pod \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.786086 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.786483 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-metadata-combined-ca-bundle\") pod \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " Dec 
12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.786551 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-ssh-key\") pod \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\" (UID: \"9dafb619-e866-4f79-8e75-28c88fe1dfe7\") " Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.792782 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9dafb619-e866-4f79-8e75-28c88fe1dfe7" (UID: "9dafb619-e866-4f79-8e75-28c88fe1dfe7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.792978 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dafb619-e866-4f79-8e75-28c88fe1dfe7-kube-api-access-kcxpr" (OuterVolumeSpecName: "kube-api-access-kcxpr") pod "9dafb619-e866-4f79-8e75-28c88fe1dfe7" (UID: "9dafb619-e866-4f79-8e75-28c88fe1dfe7"). InnerVolumeSpecName "kube-api-access-kcxpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.817524 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9dafb619-e866-4f79-8e75-28c88fe1dfe7" (UID: "9dafb619-e866-4f79-8e75-28c88fe1dfe7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.818124 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9dafb619-e866-4f79-8e75-28c88fe1dfe7" (UID: "9dafb619-e866-4f79-8e75-28c88fe1dfe7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.822978 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9dafb619-e866-4f79-8e75-28c88fe1dfe7" (UID: "9dafb619-e866-4f79-8e75-28c88fe1dfe7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.831762 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-inventory" (OuterVolumeSpecName: "inventory") pod "9dafb619-e866-4f79-8e75-28c88fe1dfe7" (UID: "9dafb619-e866-4f79-8e75-28c88fe1dfe7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.888986 4606 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.889024 4606 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.889040 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.889054 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.889070 4606 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9dafb619-e866-4f79-8e75-28c88fe1dfe7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:04:15 crc kubenswrapper[4606]: I1212 01:04:15.889087 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcxpr\" (UniqueName: \"kubernetes.io/projected/9dafb619-e866-4f79-8e75-28c88fe1dfe7-kube-api-access-kcxpr\") on node \"crc\" DevicePath \"\"" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.286110 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" 
event={"ID":"9dafb619-e866-4f79-8e75-28c88fe1dfe7","Type":"ContainerDied","Data":"dd45a8b01f5b15a21980b7227ea5af9d825f6bcefe865eb82b824b2a4379b581"} Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.286386 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.286399 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd45a8b01f5b15a21980b7227ea5af9d825f6bcefe865eb82b824b2a4379b581" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.402534 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s"] Dec 12 01:04:16 crc kubenswrapper[4606]: E1212 01:04:16.402993 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dafb619-e866-4f79-8e75-28c88fe1dfe7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.403014 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dafb619-e866-4f79-8e75-28c88fe1dfe7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.403268 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dafb619-e866-4f79-8e75-28c88fe1dfe7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.403996 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.408255 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.408343 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.408510 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.408577 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.408651 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.421911 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s"] Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.503715 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.503821 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.503853 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j955n\" (UniqueName: \"kubernetes.io/projected/7698f12d-5dde-46ae-929e-472dfebb1a90-kube-api-access-j955n\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.503938 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.504009 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.605793 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.605922 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.605966 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.605981 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j955n\" (UniqueName: \"kubernetes.io/projected/7698f12d-5dde-46ae-929e-472dfebb1a90-kube-api-access-j955n\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.606027 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.609968 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.610991 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.612021 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.613619 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.623734 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j955n\" (UniqueName: \"kubernetes.io/projected/7698f12d-5dde-46ae-929e-472dfebb1a90-kube-api-access-j955n\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:16 crc kubenswrapper[4606]: I1212 01:04:16.731585 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:04:17 crc kubenswrapper[4606]: I1212 01:04:17.303113 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s"] Dec 12 01:04:18 crc kubenswrapper[4606]: I1212 01:04:18.321894 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" event={"ID":"7698f12d-5dde-46ae-929e-472dfebb1a90","Type":"ContainerStarted","Data":"de804e2aa80bc4c69279b98c605a23ae400d59412b9bf47ea8634a39b08a038a"} Dec 12 01:04:18 crc kubenswrapper[4606]: I1212 01:04:18.322258 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" event={"ID":"7698f12d-5dde-46ae-929e-472dfebb1a90","Type":"ContainerStarted","Data":"583b4b887f5d53759c0f574855fb2b98749d29690831c9ad4319c757ca27ba5e"} Dec 12 01:04:18 crc kubenswrapper[4606]: I1212 01:04:18.349376 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" podStartSLOduration=2.160656913 podStartE2EDuration="2.34935658s" podCreationTimestamp="2025-12-12 01:04:16 +0000 UTC" firstStartedPulling="2025-12-12 01:04:17.306302986 +0000 UTC m=+2447.851655862" lastFinishedPulling="2025-12-12 01:04:17.495002653 +0000 UTC m=+2448.040355529" observedRunningTime="2025-12-12 01:04:18.338341327 +0000 UTC m=+2448.883694213" watchObservedRunningTime="2025-12-12 01:04:18.34935658 +0000 UTC m=+2448.894709456" Dec 12 01:04:29 crc kubenswrapper[4606]: I1212 01:04:29.714259 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:04:29 crc kubenswrapper[4606]: E1212 01:04:29.715161 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:04:43 crc kubenswrapper[4606]: I1212 01:04:43.699936 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:04:44 crc kubenswrapper[4606]: I1212 01:04:44.538585 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"db929c2a4f0c0ac394d9f5ec86cf2aad7286615bc2456ec2a3e2009c28374446"} Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.042863 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rxlgw"] Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.047234 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.065321 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxlgw"] Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.106095 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-utilities\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.106165 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5j6\" (UniqueName: \"kubernetes.io/projected/5703a692-bffc-433a-a2e8-bf02ea02f310-kube-api-access-2h5j6\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.106382 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-catalog-content\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.207856 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-catalog-content\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.207997 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-utilities\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.208025 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5j6\" (UniqueName: \"kubernetes.io/projected/5703a692-bffc-433a-a2e8-bf02ea02f310-kube-api-access-2h5j6\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.208397 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-catalog-content\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.208610 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-utilities\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.232722 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5j6\" (UniqueName: \"kubernetes.io/projected/5703a692-bffc-433a-a2e8-bf02ea02f310-kube-api-access-2h5j6\") pod \"community-operators-rxlgw\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.379691 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:44 crc kubenswrapper[4606]: I1212 01:05:44.711249 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxlgw"] Dec 12 01:05:45 crc kubenswrapper[4606]: I1212 01:05:45.246333 4606 generic.go:334] "Generic (PLEG): container finished" podID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerID="6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609" exitCode=0 Dec 12 01:05:45 crc kubenswrapper[4606]: I1212 01:05:45.246374 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxlgw" event={"ID":"5703a692-bffc-433a-a2e8-bf02ea02f310","Type":"ContainerDied","Data":"6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609"} Dec 12 01:05:45 crc kubenswrapper[4606]: I1212 01:05:45.246399 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxlgw" event={"ID":"5703a692-bffc-433a-a2e8-bf02ea02f310","Type":"ContainerStarted","Data":"7c7623fb9e1cf1c486ea248fd0b4dbeb4f794b6b8f6d7377a3eff338be2654c2"} Dec 12 01:05:45 crc kubenswrapper[4606]: I1212 01:05:45.249019 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 01:05:47 crc kubenswrapper[4606]: I1212 01:05:47.278684 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxlgw" event={"ID":"5703a692-bffc-433a-a2e8-bf02ea02f310","Type":"ContainerStarted","Data":"4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a"} Dec 12 01:05:48 crc kubenswrapper[4606]: I1212 01:05:48.290340 4606 generic.go:334] "Generic (PLEG): container finished" podID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerID="4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a" exitCode=0 Dec 12 01:05:48 crc kubenswrapper[4606]: I1212 01:05:48.290654 4606 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-rxlgw" event={"ID":"5703a692-bffc-433a-a2e8-bf02ea02f310","Type":"ContainerDied","Data":"4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a"} Dec 12 01:05:49 crc kubenswrapper[4606]: I1212 01:05:49.302306 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxlgw" event={"ID":"5703a692-bffc-433a-a2e8-bf02ea02f310","Type":"ContainerStarted","Data":"65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c"} Dec 12 01:05:49 crc kubenswrapper[4606]: I1212 01:05:49.329802 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rxlgw" podStartSLOduration=1.759610044 podStartE2EDuration="5.329752087s" podCreationTimestamp="2025-12-12 01:05:44 +0000 UTC" firstStartedPulling="2025-12-12 01:05:45.248828686 +0000 UTC m=+2535.794181552" lastFinishedPulling="2025-12-12 01:05:48.818970709 +0000 UTC m=+2539.364323595" observedRunningTime="2025-12-12 01:05:49.319810904 +0000 UTC m=+2539.865163790" watchObservedRunningTime="2025-12-12 01:05:49.329752087 +0000 UTC m=+2539.875104953" Dec 12 01:05:54 crc kubenswrapper[4606]: I1212 01:05:54.380369 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:54 crc kubenswrapper[4606]: I1212 01:05:54.380894 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:54 crc kubenswrapper[4606]: I1212 01:05:54.445566 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:55 crc kubenswrapper[4606]: I1212 01:05:55.414383 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:55 crc kubenswrapper[4606]: I1212 
01:05:55.466050 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxlgw"] Dec 12 01:05:57 crc kubenswrapper[4606]: I1212 01:05:57.375882 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rxlgw" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="registry-server" containerID="cri-o://65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c" gracePeriod=2 Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.338761 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.387666 4606 generic.go:334] "Generic (PLEG): container finished" podID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerID="65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c" exitCode=0 Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.387712 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxlgw" event={"ID":"5703a692-bffc-433a-a2e8-bf02ea02f310","Type":"ContainerDied","Data":"65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c"} Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.387738 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxlgw" event={"ID":"5703a692-bffc-433a-a2e8-bf02ea02f310","Type":"ContainerDied","Data":"7c7623fb9e1cf1c486ea248fd0b4dbeb4f794b6b8f6d7377a3eff338be2654c2"} Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.387758 4606 scope.go:117] "RemoveContainer" containerID="65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.387888 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxlgw" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.416547 4606 scope.go:117] "RemoveContainer" containerID="4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.435058 4606 scope.go:117] "RemoveContainer" containerID="6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.485654 4606 scope.go:117] "RemoveContainer" containerID="65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c" Dec 12 01:05:58 crc kubenswrapper[4606]: E1212 01:05:58.486293 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c\": container with ID starting with 65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c not found: ID does not exist" containerID="65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.486438 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c"} err="failed to get container status \"65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c\": rpc error: code = NotFound desc = could not find container \"65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c\": container with ID starting with 65ee984c8aa72e0b22a66ade027b0c535d7491fc38c8563fca8b1202a7db672c not found: ID does not exist" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.486466 4606 scope.go:117] "RemoveContainer" containerID="4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a" Dec 12 01:05:58 crc kubenswrapper[4606]: E1212 01:05:58.486885 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a\": container with ID starting with 4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a not found: ID does not exist" containerID="4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.486913 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a"} err="failed to get container status \"4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a\": rpc error: code = NotFound desc = could not find container \"4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a\": container with ID starting with 4d77a679751dc237cb9c46bc1d692643c997defc1e5e0ba5b41d932d8b422c9a not found: ID does not exist" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.486927 4606 scope.go:117] "RemoveContainer" containerID="6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609" Dec 12 01:05:58 crc kubenswrapper[4606]: E1212 01:05:58.487413 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609\": container with ID starting with 6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609 not found: ID does not exist" containerID="6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.487433 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609"} err="failed to get container status \"6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609\": rpc error: code = NotFound desc = could not find container 
\"6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609\": container with ID starting with 6b09c82c1f9f7d0cfc2ae8582699fbc127c0961363ca0514705e063ea344a609 not found: ID does not exist" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.515975 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-utilities\") pod \"5703a692-bffc-433a-a2e8-bf02ea02f310\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.516281 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-catalog-content\") pod \"5703a692-bffc-433a-a2e8-bf02ea02f310\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.516393 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h5j6\" (UniqueName: \"kubernetes.io/projected/5703a692-bffc-433a-a2e8-bf02ea02f310-kube-api-access-2h5j6\") pod \"5703a692-bffc-433a-a2e8-bf02ea02f310\" (UID: \"5703a692-bffc-433a-a2e8-bf02ea02f310\") " Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.517844 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-utilities" (OuterVolumeSpecName: "utilities") pod "5703a692-bffc-433a-a2e8-bf02ea02f310" (UID: "5703a692-bffc-433a-a2e8-bf02ea02f310"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.526738 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5703a692-bffc-433a-a2e8-bf02ea02f310-kube-api-access-2h5j6" (OuterVolumeSpecName: "kube-api-access-2h5j6") pod "5703a692-bffc-433a-a2e8-bf02ea02f310" (UID: "5703a692-bffc-433a-a2e8-bf02ea02f310"). InnerVolumeSpecName "kube-api-access-2h5j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.567592 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5703a692-bffc-433a-a2e8-bf02ea02f310" (UID: "5703a692-bffc-433a-a2e8-bf02ea02f310"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.618410 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.618443 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h5j6\" (UniqueName: \"kubernetes.io/projected/5703a692-bffc-433a-a2e8-bf02ea02f310-kube-api-access-2h5j6\") on node \"crc\" DevicePath \"\"" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.618457 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5703a692-bffc-433a-a2e8-bf02ea02f310-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 01:05:58.727058 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxlgw"] Dec 12 01:05:58 crc kubenswrapper[4606]: I1212 
01:05:58.735798 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rxlgw"] Dec 12 01:05:59 crc kubenswrapper[4606]: I1212 01:05:59.720019 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" path="/var/lib/kubelet/pods/5703a692-bffc-433a-a2e8-bf02ea02f310/volumes" Dec 12 01:07:02 crc kubenswrapper[4606]: I1212 01:07:02.010306 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:07:02 crc kubenswrapper[4606]: I1212 01:07:02.010888 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:07:09 crc kubenswrapper[4606]: I1212 01:07:09.929200 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fp5q4"] Dec 12 01:07:09 crc kubenswrapper[4606]: E1212 01:07:09.930443 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="registry-server" Dec 12 01:07:09 crc kubenswrapper[4606]: I1212 01:07:09.930466 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="registry-server" Dec 12 01:07:09 crc kubenswrapper[4606]: E1212 01:07:09.930497 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="extract-utilities" Dec 12 01:07:09 crc kubenswrapper[4606]: I1212 01:07:09.930510 4606 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="extract-utilities" Dec 12 01:07:09 crc kubenswrapper[4606]: E1212 01:07:09.930538 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="extract-content" Dec 12 01:07:09 crc kubenswrapper[4606]: I1212 01:07:09.930550 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="extract-content" Dec 12 01:07:09 crc kubenswrapper[4606]: I1212 01:07:09.953571 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5703a692-bffc-433a-a2e8-bf02ea02f310" containerName="registry-server" Dec 12 01:07:09 crc kubenswrapper[4606]: I1212 01:07:09.974026 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:09 crc kubenswrapper[4606]: I1212 01:07:09.981523 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fp5q4"] Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.141623 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8290e814-06ee-41a9-a13a-d3c6c94c87b3-utilities\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.141961 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8290e814-06ee-41a9-a13a-d3c6c94c87b3-catalog-content\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.142040 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbfn\" (UniqueName: \"kubernetes.io/projected/8290e814-06ee-41a9-a13a-d3c6c94c87b3-kube-api-access-snbfn\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.243574 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8290e814-06ee-41a9-a13a-d3c6c94c87b3-catalog-content\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.243620 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbfn\" (UniqueName: \"kubernetes.io/projected/8290e814-06ee-41a9-a13a-d3c6c94c87b3-kube-api-access-snbfn\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.243708 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8290e814-06ee-41a9-a13a-d3c6c94c87b3-utilities\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.244100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8290e814-06ee-41a9-a13a-d3c6c94c87b3-utilities\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.244222 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8290e814-06ee-41a9-a13a-d3c6c94c87b3-catalog-content\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.261643 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbfn\" (UniqueName: \"kubernetes.io/projected/8290e814-06ee-41a9-a13a-d3c6c94c87b3-kube-api-access-snbfn\") pod \"certified-operators-fp5q4\" (UID: \"8290e814-06ee-41a9-a13a-d3c6c94c87b3\") " pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.299957 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:10 crc kubenswrapper[4606]: I1212 01:07:10.900350 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fp5q4"] Dec 12 01:07:11 crc kubenswrapper[4606]: I1212 01:07:11.109740 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp5q4" event={"ID":"8290e814-06ee-41a9-a13a-d3c6c94c87b3","Type":"ContainerStarted","Data":"0bc877d046e0ac3813fb8d01144d4a035b24cac4a68d6e691dcc3d83b5bc33df"} Dec 12 01:07:11 crc kubenswrapper[4606]: I1212 01:07:11.110051 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp5q4" event={"ID":"8290e814-06ee-41a9-a13a-d3c6c94c87b3","Type":"ContainerStarted","Data":"4aacf5f85c137d094f84f59d7d9311a3049cd068acff79ee333a8e6461d79d53"} Dec 12 01:07:12 crc kubenswrapper[4606]: I1212 01:07:12.120067 4606 generic.go:334] "Generic (PLEG): container finished" podID="8290e814-06ee-41a9-a13a-d3c6c94c87b3" containerID="0bc877d046e0ac3813fb8d01144d4a035b24cac4a68d6e691dcc3d83b5bc33df" exitCode=0 Dec 12 01:07:12 crc 
kubenswrapper[4606]: I1212 01:07:12.120112 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp5q4" event={"ID":"8290e814-06ee-41a9-a13a-d3c6c94c87b3","Type":"ContainerDied","Data":"0bc877d046e0ac3813fb8d01144d4a035b24cac4a68d6e691dcc3d83b5bc33df"} Dec 12 01:07:18 crc kubenswrapper[4606]: I1212 01:07:18.181021 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp5q4" event={"ID":"8290e814-06ee-41a9-a13a-d3c6c94c87b3","Type":"ContainerStarted","Data":"0faac0fcff9efeef4c382f15282dc5d42c37baac29ab7689bdd7672d0778aa56"} Dec 12 01:07:19 crc kubenswrapper[4606]: I1212 01:07:19.197847 4606 generic.go:334] "Generic (PLEG): container finished" podID="8290e814-06ee-41a9-a13a-d3c6c94c87b3" containerID="0faac0fcff9efeef4c382f15282dc5d42c37baac29ab7689bdd7672d0778aa56" exitCode=0 Dec 12 01:07:19 crc kubenswrapper[4606]: I1212 01:07:19.197885 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp5q4" event={"ID":"8290e814-06ee-41a9-a13a-d3c6c94c87b3","Type":"ContainerDied","Data":"0faac0fcff9efeef4c382f15282dc5d42c37baac29ab7689bdd7672d0778aa56"} Dec 12 01:07:21 crc kubenswrapper[4606]: I1212 01:07:21.238604 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fp5q4" event={"ID":"8290e814-06ee-41a9-a13a-d3c6c94c87b3","Type":"ContainerStarted","Data":"e4bd879027622a2cbb2d93d7e3bd787fd3ab6fffed5c3a93d720bc15449f8efd"} Dec 12 01:07:21 crc kubenswrapper[4606]: I1212 01:07:21.262722 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fp5q4" podStartSLOduration=4.089501083 podStartE2EDuration="12.262692046s" podCreationTimestamp="2025-12-12 01:07:09 +0000 UTC" firstStartedPulling="2025-12-12 01:07:12.122937986 +0000 UTC m=+2622.668290852" lastFinishedPulling="2025-12-12 01:07:20.296128949 +0000 UTC m=+2630.841481815" 
observedRunningTime="2025-12-12 01:07:21.257502938 +0000 UTC m=+2631.802855814" watchObservedRunningTime="2025-12-12 01:07:21.262692046 +0000 UTC m=+2631.808044912" Dec 12 01:07:30 crc kubenswrapper[4606]: I1212 01:07:30.301163 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:30 crc kubenswrapper[4606]: I1212 01:07:30.301905 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:30 crc kubenswrapper[4606]: I1212 01:07:30.364612 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:30 crc kubenswrapper[4606]: I1212 01:07:30.422156 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fp5q4" Dec 12 01:07:30 crc kubenswrapper[4606]: I1212 01:07:30.505786 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fp5q4"] Dec 12 01:07:30 crc kubenswrapper[4606]: I1212 01:07:30.607692 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6vsb"] Dec 12 01:07:30 crc kubenswrapper[4606]: I1212 01:07:30.608225 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c6vsb" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerName="registry-server" containerID="cri-o://fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3" gracePeriod=2 Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.133469 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.199164 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-utilities\") pod \"c717d9dc-08d4-4863-8788-0f151d9f6c21\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.199435 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-catalog-content\") pod \"c717d9dc-08d4-4863-8788-0f151d9f6c21\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.199593 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmgk2\" (UniqueName: \"kubernetes.io/projected/c717d9dc-08d4-4863-8788-0f151d9f6c21-kube-api-access-gmgk2\") pod \"c717d9dc-08d4-4863-8788-0f151d9f6c21\" (UID: \"c717d9dc-08d4-4863-8788-0f151d9f6c21\") " Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.204692 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-utilities" (OuterVolumeSpecName: "utilities") pod "c717d9dc-08d4-4863-8788-0f151d9f6c21" (UID: "c717d9dc-08d4-4863-8788-0f151d9f6c21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.220095 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c717d9dc-08d4-4863-8788-0f151d9f6c21-kube-api-access-gmgk2" (OuterVolumeSpecName: "kube-api-access-gmgk2") pod "c717d9dc-08d4-4863-8788-0f151d9f6c21" (UID: "c717d9dc-08d4-4863-8788-0f151d9f6c21"). InnerVolumeSpecName "kube-api-access-gmgk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.290108 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c717d9dc-08d4-4863-8788-0f151d9f6c21" (UID: "c717d9dc-08d4-4863-8788-0f151d9f6c21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.301342 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmgk2\" (UniqueName: \"kubernetes.io/projected/c717d9dc-08d4-4863-8788-0f151d9f6c21-kube-api-access-gmgk2\") on node \"crc\" DevicePath \"\"" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.301529 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.301629 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c717d9dc-08d4-4863-8788-0f151d9f6c21-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.324149 4606 generic.go:334] "Generic (PLEG): container finished" podID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerID="fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3" exitCode=0 Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.324211 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6vsb" event={"ID":"c717d9dc-08d4-4863-8788-0f151d9f6c21","Type":"ContainerDied","Data":"fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3"} Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.324256 4606 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-c6vsb" event={"ID":"c717d9dc-08d4-4863-8788-0f151d9f6c21","Type":"ContainerDied","Data":"852f71ff7586852577f844b4c5db496b86c5420dfa5c50a7213dfd6ad338cd60"} Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.324299 4606 scope.go:117] "RemoveContainer" containerID="fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.325364 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6vsb" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.349289 4606 scope.go:117] "RemoveContainer" containerID="c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.378233 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6vsb"] Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.380814 4606 scope.go:117] "RemoveContainer" containerID="4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.383329 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c6vsb"] Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.416083 4606 scope.go:117] "RemoveContainer" containerID="fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3" Dec 12 01:07:31 crc kubenswrapper[4606]: E1212 01:07:31.416585 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3\": container with ID starting with fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3 not found: ID does not exist" containerID="fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 
01:07:31.416627 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3"} err="failed to get container status \"fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3\": rpc error: code = NotFound desc = could not find container \"fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3\": container with ID starting with fa748d9f7908e02d3b6a0458928dd8cabff04f84f7e707a44c2414506e3a0cf3 not found: ID does not exist" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.416656 4606 scope.go:117] "RemoveContainer" containerID="c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3" Dec 12 01:07:31 crc kubenswrapper[4606]: E1212 01:07:31.416974 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3\": container with ID starting with c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3 not found: ID does not exist" containerID="c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.417002 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3"} err="failed to get container status \"c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3\": rpc error: code = NotFound desc = could not find container \"c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3\": container with ID starting with c34e4f04b61f7aa7fb53338beda1a43a80ba95b691060095ffbdf6a5ab83dcc3 not found: ID does not exist" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.417026 4606 scope.go:117] "RemoveContainer" containerID="4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb" Dec 12 01:07:31 crc 
kubenswrapper[4606]: E1212 01:07:31.418627 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb\": container with ID starting with 4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb not found: ID does not exist" containerID="4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.418691 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb"} err="failed to get container status \"4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb\": rpc error: code = NotFound desc = could not find container \"4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb\": container with ID starting with 4945ef5077d166c13dafecde7c8166708e9d9ba74ab55d4375dcc4154d23bbfb not found: ID does not exist" Dec 12 01:07:31 crc kubenswrapper[4606]: I1212 01:07:31.709364 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" path="/var/lib/kubelet/pods/c717d9dc-08d4-4863-8788-0f151d9f6c21/volumes" Dec 12 01:07:32 crc kubenswrapper[4606]: I1212 01:07:32.010805 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:07:32 crc kubenswrapper[4606]: I1212 01:07:32.010875 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.010449 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.010909 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.010966 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.011898 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db929c2a4f0c0ac394d9f5ec86cf2aad7286615bc2456ec2a3e2009c28374446"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.011965 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://db929c2a4f0c0ac394d9f5ec86cf2aad7286615bc2456ec2a3e2009c28374446" gracePeriod=600 Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.586765 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" 
containerID="db929c2a4f0c0ac394d9f5ec86cf2aad7286615bc2456ec2a3e2009c28374446" exitCode=0 Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.586983 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"db929c2a4f0c0ac394d9f5ec86cf2aad7286615bc2456ec2a3e2009c28374446"} Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.587006 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f"} Dec 12 01:08:02 crc kubenswrapper[4606]: I1212 01:08:02.587020 4606 scope.go:117] "RemoveContainer" containerID="5c37b89455dd84575aa571020478d60921333939ed948577bddb71be18b5f553" Dec 12 01:09:07 crc kubenswrapper[4606]: I1212 01:09:07.189843 4606 generic.go:334] "Generic (PLEG): container finished" podID="7698f12d-5dde-46ae-929e-472dfebb1a90" containerID="de804e2aa80bc4c69279b98c605a23ae400d59412b9bf47ea8634a39b08a038a" exitCode=0 Dec 12 01:09:07 crc kubenswrapper[4606]: I1212 01:09:07.190441 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" event={"ID":"7698f12d-5dde-46ae-929e-472dfebb1a90","Type":"ContainerDied","Data":"de804e2aa80bc4c69279b98c605a23ae400d59412b9bf47ea8634a39b08a038a"} Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.674087 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.785363 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j955n\" (UniqueName: \"kubernetes.io/projected/7698f12d-5dde-46ae-929e-472dfebb1a90-kube-api-access-j955n\") pod \"7698f12d-5dde-46ae-929e-472dfebb1a90\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.785420 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-inventory\") pod \"7698f12d-5dde-46ae-929e-472dfebb1a90\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.785446 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-secret-0\") pod \"7698f12d-5dde-46ae-929e-472dfebb1a90\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.785501 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-combined-ca-bundle\") pod \"7698f12d-5dde-46ae-929e-472dfebb1a90\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.785671 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-ssh-key\") pod \"7698f12d-5dde-46ae-929e-472dfebb1a90\" (UID: \"7698f12d-5dde-46ae-929e-472dfebb1a90\") " Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.795072 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7698f12d-5dde-46ae-929e-472dfebb1a90" (UID: "7698f12d-5dde-46ae-929e-472dfebb1a90"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.795311 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7698f12d-5dde-46ae-929e-472dfebb1a90-kube-api-access-j955n" (OuterVolumeSpecName: "kube-api-access-j955n") pod "7698f12d-5dde-46ae-929e-472dfebb1a90" (UID: "7698f12d-5dde-46ae-929e-472dfebb1a90"). InnerVolumeSpecName "kube-api-access-j955n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.815151 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7698f12d-5dde-46ae-929e-472dfebb1a90" (UID: "7698f12d-5dde-46ae-929e-472dfebb1a90"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.819032 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7698f12d-5dde-46ae-929e-472dfebb1a90" (UID: "7698f12d-5dde-46ae-929e-472dfebb1a90"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.821470 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-inventory" (OuterVolumeSpecName: "inventory") pod "7698f12d-5dde-46ae-929e-472dfebb1a90" (UID: "7698f12d-5dde-46ae-929e-472dfebb1a90"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.888104 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.888134 4606 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.888147 4606 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.888158 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7698f12d-5dde-46ae-929e-472dfebb1a90-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:08 crc kubenswrapper[4606]: I1212 01:09:08.888172 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j955n\" (UniqueName: \"kubernetes.io/projected/7698f12d-5dde-46ae-929e-472dfebb1a90-kube-api-access-j955n\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.211004 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" event={"ID":"7698f12d-5dde-46ae-929e-472dfebb1a90","Type":"ContainerDied","Data":"583b4b887f5d53759c0f574855fb2b98749d29690831c9ad4319c757ca27ba5e"} Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.211036 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="583b4b887f5d53759c0f574855fb2b98749d29690831c9ad4319c757ca27ba5e" Dec 12 01:09:09 
crc kubenswrapper[4606]: I1212 01:09:09.211072 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.301548 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g"] Dec 12 01:09:09 crc kubenswrapper[4606]: E1212 01:09:09.301889 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerName="extract-content" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.301906 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerName="extract-content" Dec 12 01:09:09 crc kubenswrapper[4606]: E1212 01:09:09.301926 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerName="extract-utilities" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.301933 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerName="extract-utilities" Dec 12 01:09:09 crc kubenswrapper[4606]: E1212 01:09:09.301958 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7698f12d-5dde-46ae-929e-472dfebb1a90" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.301965 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="7698f12d-5dde-46ae-929e-472dfebb1a90" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 12 01:09:09 crc kubenswrapper[4606]: E1212 01:09:09.301985 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerName="registry-server" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.301991 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" 
containerName="registry-server" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.302156 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="7698f12d-5dde-46ae-929e-472dfebb1a90" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.302194 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c717d9dc-08d4-4863-8788-0f151d9f6c21" containerName="registry-server" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.302794 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.306698 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.306824 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.306929 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.307002 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.307140 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.311989 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.312190 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.322482 4606 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g"] Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.398932 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399164 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399345 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399419 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399498 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399586 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399718 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399840 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ggk\" (UniqueName: \"kubernetes.io/projected/f70ab77b-2e78-421a-8563-8d8d0e049800-kube-api-access-64ggk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.399875 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501222 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501295 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501329 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501377 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc 
kubenswrapper[4606]: I1212 01:09:09.501402 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501425 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501495 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501563 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64ggk\" (UniqueName: \"kubernetes.io/projected/f70ab77b-2e78-421a-8563-8d8d0e049800-kube-api-access-64ggk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.501592 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.502557 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.506137 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.506231 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.506712 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.508266 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.510163 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.514829 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.518459 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.519120 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ggk\" (UniqueName: \"kubernetes.io/projected/f70ab77b-2e78-421a-8563-8d8d0e049800-kube-api-access-64ggk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4ls7g\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:09 crc kubenswrapper[4606]: I1212 01:09:09.621975 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:09:10 crc kubenswrapper[4606]: I1212 01:09:10.183746 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g"] Dec 12 01:09:10 crc kubenswrapper[4606]: I1212 01:09:10.221872 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" event={"ID":"f70ab77b-2e78-421a-8563-8d8d0e049800","Type":"ContainerStarted","Data":"374aa9f52e40b920034b0e8f55f0eb873ed480c5ac1308c976d0e9d4b0ef53b9"} Dec 12 01:09:11 crc kubenswrapper[4606]: I1212 01:09:11.232832 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" event={"ID":"f70ab77b-2e78-421a-8563-8d8d0e049800","Type":"ContainerStarted","Data":"14bfa7db0a6c2a2d031f4660cb0b55ded04a670b76cc33258bc70b1829c68b0a"} Dec 12 01:09:11 crc kubenswrapper[4606]: I1212 01:09:11.253135 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" podStartSLOduration=2.035326832 podStartE2EDuration="2.252933587s" podCreationTimestamp="2025-12-12 01:09:09 +0000 UTC" firstStartedPulling="2025-12-12 01:09:10.185878505 +0000 UTC m=+2740.731231381" lastFinishedPulling="2025-12-12 01:09:10.40348527 +0000 UTC m=+2740.948838136" observedRunningTime="2025-12-12 01:09:11.252263599 +0000 UTC m=+2741.797616475" watchObservedRunningTime="2025-12-12 01:09:11.252933587 +0000 UTC m=+2741.798286463" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.348550 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cprpj"] Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.351341 4606 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.361909 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cprpj"] Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.383080 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s27dm\" (UniqueName: \"kubernetes.io/projected/25fd83b5-3248-4d4d-90b4-6480700ad8ea-kube-api-access-s27dm\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.383365 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-utilities\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.383489 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-catalog-content\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.484994 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-catalog-content\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.485133 
4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s27dm\" (UniqueName: \"kubernetes.io/projected/25fd83b5-3248-4d4d-90b4-6480700ad8ea-kube-api-access-s27dm\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.485184 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-utilities\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.485648 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-utilities\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.485857 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-catalog-content\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.517320 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27dm\" (UniqueName: \"kubernetes.io/projected/25fd83b5-3248-4d4d-90b4-6480700ad8ea-kube-api-access-s27dm\") pod \"redhat-marketplace-cprpj\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:23 crc kubenswrapper[4606]: I1212 01:09:23.674915 4606 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:24 crc kubenswrapper[4606]: I1212 01:09:24.210954 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cprpj"] Dec 12 01:09:24 crc kubenswrapper[4606]: I1212 01:09:24.364163 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cprpj" event={"ID":"25fd83b5-3248-4d4d-90b4-6480700ad8ea","Type":"ContainerStarted","Data":"cfd15eef78b657e5a0de4dab587b371628f831254d7c6c16778d6d92c6d67fe3"} Dec 12 01:09:25 crc kubenswrapper[4606]: I1212 01:09:25.374372 4606 generic.go:334] "Generic (PLEG): container finished" podID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerID="777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c" exitCode=0 Dec 12 01:09:25 crc kubenswrapper[4606]: I1212 01:09:25.374656 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cprpj" event={"ID":"25fd83b5-3248-4d4d-90b4-6480700ad8ea","Type":"ContainerDied","Data":"777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c"} Dec 12 01:09:26 crc kubenswrapper[4606]: I1212 01:09:26.390933 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cprpj" event={"ID":"25fd83b5-3248-4d4d-90b4-6480700ad8ea","Type":"ContainerStarted","Data":"e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c"} Dec 12 01:09:26 crc kubenswrapper[4606]: E1212 01:09:26.882719 4606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25fd83b5_3248_4d4d_90b4_6480700ad8ea.slice/crio-e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25fd83b5_3248_4d4d_90b4_6480700ad8ea.slice/crio-conmon-e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c.scope\": RecentStats: unable to find data in memory cache]" Dec 12 01:09:27 crc kubenswrapper[4606]: I1212 01:09:27.402575 4606 generic.go:334] "Generic (PLEG): container finished" podID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerID="e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c" exitCode=0 Dec 12 01:09:27 crc kubenswrapper[4606]: I1212 01:09:27.402666 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cprpj" event={"ID":"25fd83b5-3248-4d4d-90b4-6480700ad8ea","Type":"ContainerDied","Data":"e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c"} Dec 12 01:09:28 crc kubenswrapper[4606]: I1212 01:09:28.415202 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cprpj" event={"ID":"25fd83b5-3248-4d4d-90b4-6480700ad8ea","Type":"ContainerStarted","Data":"937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc"} Dec 12 01:09:28 crc kubenswrapper[4606]: I1212 01:09:28.445351 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cprpj" podStartSLOduration=2.750200135 podStartE2EDuration="5.445332464s" podCreationTimestamp="2025-12-12 01:09:23 +0000 UTC" firstStartedPulling="2025-12-12 01:09:25.377538313 +0000 UTC m=+2755.922891179" lastFinishedPulling="2025-12-12 01:09:28.072670642 +0000 UTC m=+2758.618023508" observedRunningTime="2025-12-12 01:09:28.437386453 +0000 UTC m=+2758.982739359" watchObservedRunningTime="2025-12-12 01:09:28.445332464 +0000 UTC m=+2758.990685330" Dec 12 01:09:33 crc kubenswrapper[4606]: I1212 01:09:33.676360 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:33 crc 
kubenswrapper[4606]: I1212 01:09:33.677802 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:33 crc kubenswrapper[4606]: I1212 01:09:33.750542 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:34 crc kubenswrapper[4606]: I1212 01:09:34.521355 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:34 crc kubenswrapper[4606]: I1212 01:09:34.573562 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cprpj"] Dec 12 01:09:36 crc kubenswrapper[4606]: I1212 01:09:36.491696 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cprpj" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="registry-server" containerID="cri-o://937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc" gracePeriod=2 Dec 12 01:09:36 crc kubenswrapper[4606]: I1212 01:09:36.996531 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.181779 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s27dm\" (UniqueName: \"kubernetes.io/projected/25fd83b5-3248-4d4d-90b4-6480700ad8ea-kube-api-access-s27dm\") pod \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.181870 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-utilities\") pod \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.182043 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-catalog-content\") pod \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\" (UID: \"25fd83b5-3248-4d4d-90b4-6480700ad8ea\") " Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.183330 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-utilities" (OuterVolumeSpecName: "utilities") pod "25fd83b5-3248-4d4d-90b4-6480700ad8ea" (UID: "25fd83b5-3248-4d4d-90b4-6480700ad8ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.189001 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fd83b5-3248-4d4d-90b4-6480700ad8ea-kube-api-access-s27dm" (OuterVolumeSpecName: "kube-api-access-s27dm") pod "25fd83b5-3248-4d4d-90b4-6480700ad8ea" (UID: "25fd83b5-3248-4d4d-90b4-6480700ad8ea"). InnerVolumeSpecName "kube-api-access-s27dm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.207029 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25fd83b5-3248-4d4d-90b4-6480700ad8ea" (UID: "25fd83b5-3248-4d4d-90b4-6480700ad8ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.285010 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.285396 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fd83b5-3248-4d4d-90b4-6480700ad8ea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.285415 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s27dm\" (UniqueName: \"kubernetes.io/projected/25fd83b5-3248-4d4d-90b4-6480700ad8ea-kube-api-access-s27dm\") on node \"crc\" DevicePath \"\"" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.504003 4606 generic.go:334] "Generic (PLEG): container finished" podID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerID="937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc" exitCode=0 Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.504075 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cprpj" event={"ID":"25fd83b5-3248-4d4d-90b4-6480700ad8ea","Type":"ContainerDied","Data":"937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc"} Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.504134 4606 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-cprpj" event={"ID":"25fd83b5-3248-4d4d-90b4-6480700ad8ea","Type":"ContainerDied","Data":"cfd15eef78b657e5a0de4dab587b371628f831254d7c6c16778d6d92c6d67fe3"} Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.504152 4606 scope.go:117] "RemoveContainer" containerID="937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.506289 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cprpj" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.542450 4606 scope.go:117] "RemoveContainer" containerID="e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.575496 4606 scope.go:117] "RemoveContainer" containerID="777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.575614 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cprpj"] Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.591761 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cprpj"] Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.625374 4606 scope.go:117] "RemoveContainer" containerID="937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc" Dec 12 01:09:37 crc kubenswrapper[4606]: E1212 01:09:37.625801 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc\": container with ID starting with 937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc not found: ID does not exist" containerID="937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.625830 4606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc"} err="failed to get container status \"937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc\": rpc error: code = NotFound desc = could not find container \"937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc\": container with ID starting with 937bada58b39fa23e581f9ea1caa19ebf0ff1f0beebad3eed57e8a9d10186adc not found: ID does not exist" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.625852 4606 scope.go:117] "RemoveContainer" containerID="e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c" Dec 12 01:09:37 crc kubenswrapper[4606]: E1212 01:09:37.626243 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c\": container with ID starting with e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c not found: ID does not exist" containerID="e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.626269 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c"} err="failed to get container status \"e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c\": rpc error: code = NotFound desc = could not find container \"e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c\": container with ID starting with e016bd1403723ce3cd062ae35ba1be5b35b717bd471b8ba92c41492f7b1b975c not found: ID does not exist" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.626282 4606 scope.go:117] "RemoveContainer" containerID="777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c" Dec 12 01:09:37 crc kubenswrapper[4606]: E1212 
01:09:37.626525 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c\": container with ID starting with 777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c not found: ID does not exist" containerID="777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.626550 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c"} err="failed to get container status \"777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c\": rpc error: code = NotFound desc = could not find container \"777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c\": container with ID starting with 777d9d2e05f2693c8267c5c117a70eda18f03c883c4eeeebcdab8751680cbb4c not found: ID does not exist" Dec 12 01:09:37 crc kubenswrapper[4606]: I1212 01:09:37.709424 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" path="/var/lib/kubelet/pods/25fd83b5-3248-4d4d-90b4-6480700ad8ea/volumes" Dec 12 01:10:02 crc kubenswrapper[4606]: I1212 01:10:02.010346 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:10:02 crc kubenswrapper[4606]: I1212 01:10:02.010915 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 12 01:10:32 crc kubenswrapper[4606]: I1212 01:10:32.010136 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:10:32 crc kubenswrapper[4606]: I1212 01:10:32.010794 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.010095 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.010746 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.010802 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.011634 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f"} 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.011694 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" gracePeriod=600 Dec 12 01:11:02 crc kubenswrapper[4606]: E1212 01:11:02.142850 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.309014 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" exitCode=0 Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.309050 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f"} Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.309105 4606 scope.go:117] "RemoveContainer" containerID="db929c2a4f0c0ac394d9f5ec86cf2aad7286615bc2456ec2a3e2009c28374446" Dec 12 01:11:02 crc kubenswrapper[4606]: I1212 01:11:02.309554 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 
12 01:11:02 crc kubenswrapper[4606]: E1212 01:11:02.309832 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:11:17 crc kubenswrapper[4606]: I1212 01:11:17.700362 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:11:17 crc kubenswrapper[4606]: E1212 01:11:17.701504 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:11:29 crc kubenswrapper[4606]: I1212 01:11:29.707289 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:11:29 crc kubenswrapper[4606]: E1212 01:11:29.708360 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:11:43 crc kubenswrapper[4606]: I1212 01:11:43.701344 4606 scope.go:117] "RemoveContainer" 
containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:11:43 crc kubenswrapper[4606]: E1212 01:11:43.702295 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.540337 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pc77x"] Dec 12 01:11:50 crc kubenswrapper[4606]: E1212 01:11:50.541458 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="extract-content" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.541483 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="extract-content" Dec 12 01:11:50 crc kubenswrapper[4606]: E1212 01:11:50.541540 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="extract-utilities" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.541552 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="extract-utilities" Dec 12 01:11:50 crc kubenswrapper[4606]: E1212 01:11:50.541568 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="registry-server" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.541577 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="registry-server" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.541795 
4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fd83b5-3248-4d4d-90b4-6480700ad8ea" containerName="registry-server" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.543877 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.594410 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pc77x"] Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.604914 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl96v\" (UniqueName: \"kubernetes.io/projected/288bb9b8-2ffc-4584-81d6-863512639e3e-kube-api-access-dl96v\") pod \"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.604969 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-utilities\") pod \"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.605074 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-catalog-content\") pod \"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.707677 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-catalog-content\") pod 
\"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.708373 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-catalog-content\") pod \"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.709044 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl96v\" (UniqueName: \"kubernetes.io/projected/288bb9b8-2ffc-4584-81d6-863512639e3e-kube-api-access-dl96v\") pod \"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.709126 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-utilities\") pod \"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.709903 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-utilities\") pod \"redhat-operators-pc77x\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.748292 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl96v\" (UniqueName: \"kubernetes.io/projected/288bb9b8-2ffc-4584-81d6-863512639e3e-kube-api-access-dl96v\") pod \"redhat-operators-pc77x\" (UID: 
\"288bb9b8-2ffc-4584-81d6-863512639e3e\") " pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:50 crc kubenswrapper[4606]: I1212 01:11:50.898886 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:11:51 crc kubenswrapper[4606]: I1212 01:11:51.401202 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pc77x"] Dec 12 01:11:51 crc kubenswrapper[4606]: I1212 01:11:51.799904 4606 generic.go:334] "Generic (PLEG): container finished" podID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerID="0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a" exitCode=0 Dec 12 01:11:51 crc kubenswrapper[4606]: I1212 01:11:51.799949 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pc77x" event={"ID":"288bb9b8-2ffc-4584-81d6-863512639e3e","Type":"ContainerDied","Data":"0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a"} Dec 12 01:11:51 crc kubenswrapper[4606]: I1212 01:11:51.800141 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pc77x" event={"ID":"288bb9b8-2ffc-4584-81d6-863512639e3e","Type":"ContainerStarted","Data":"527bc57a2c6d8156a02b762b3eebd3a5e4095e728ac8b57e8cd4d31fbd283731"} Dec 12 01:11:51 crc kubenswrapper[4606]: I1212 01:11:51.802136 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 01:11:52 crc kubenswrapper[4606]: I1212 01:11:52.811064 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pc77x" event={"ID":"288bb9b8-2ffc-4584-81d6-863512639e3e","Type":"ContainerStarted","Data":"b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090"} Dec 12 01:11:55 crc kubenswrapper[4606]: I1212 01:11:55.837359 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerID="b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090" exitCode=0 Dec 12 01:11:55 crc kubenswrapper[4606]: I1212 01:11:55.837440 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pc77x" event={"ID":"288bb9b8-2ffc-4584-81d6-863512639e3e","Type":"ContainerDied","Data":"b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090"} Dec 12 01:11:56 crc kubenswrapper[4606]: I1212 01:11:56.848088 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pc77x" event={"ID":"288bb9b8-2ffc-4584-81d6-863512639e3e","Type":"ContainerStarted","Data":"017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4"} Dec 12 01:11:56 crc kubenswrapper[4606]: I1212 01:11:56.878446 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pc77x" podStartSLOduration=2.3913888070000002 podStartE2EDuration="6.878409369s" podCreationTimestamp="2025-12-12 01:11:50 +0000 UTC" firstStartedPulling="2025-12-12 01:11:51.80190824 +0000 UTC m=+2902.347261096" lastFinishedPulling="2025-12-12 01:11:56.288928792 +0000 UTC m=+2906.834281658" observedRunningTime="2025-12-12 01:11:56.872667316 +0000 UTC m=+2907.418020182" watchObservedRunningTime="2025-12-12 01:11:56.878409369 +0000 UTC m=+2907.423762225" Dec 12 01:11:57 crc kubenswrapper[4606]: I1212 01:11:57.699839 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:11:57 crc kubenswrapper[4606]: E1212 01:11:57.700165 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:12:00 crc kubenswrapper[4606]: I1212 01:12:00.899142 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:12:00 crc kubenswrapper[4606]: I1212 01:12:00.899648 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:12:01 crc kubenswrapper[4606]: I1212 01:12:01.889494 4606 generic.go:334] "Generic (PLEG): container finished" podID="f70ab77b-2e78-421a-8563-8d8d0e049800" containerID="14bfa7db0a6c2a2d031f4660cb0b55ded04a670b76cc33258bc70b1829c68b0a" exitCode=0 Dec 12 01:12:01 crc kubenswrapper[4606]: I1212 01:12:01.889593 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" event={"ID":"f70ab77b-2e78-421a-8563-8d8d0e049800","Type":"ContainerDied","Data":"14bfa7db0a6c2a2d031f4660cb0b55ded04a670b76cc33258bc70b1829c68b0a"} Dec 12 01:12:01 crc kubenswrapper[4606]: I1212 01:12:01.980762 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pc77x" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="registry-server" probeResult="failure" output=< Dec 12 01:12:01 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 01:12:01 crc kubenswrapper[4606]: > Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.356231 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.458927 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-inventory\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459257 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-0\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459309 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-ssh-key\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459344 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-0\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459364 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-1\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459518 4606 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-64ggk\" (UniqueName: \"kubernetes.io/projected/f70ab77b-2e78-421a-8563-8d8d0e049800-kube-api-access-64ggk\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459572 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-extra-config-0\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459590 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-1\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.459609 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-combined-ca-bundle\") pod \"f70ab77b-2e78-421a-8563-8d8d0e049800\" (UID: \"f70ab77b-2e78-421a-8563-8d8d0e049800\") " Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.467375 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.514830 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.519631 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.520124 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70ab77b-2e78-421a-8563-8d8d0e049800-kube-api-access-64ggk" (OuterVolumeSpecName: "kube-api-access-64ggk") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "kube-api-access-64ggk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.520944 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.538842 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.539132 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-inventory" (OuterVolumeSpecName: "inventory") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.554080 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.555054 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f70ab77b-2e78-421a-8563-8d8d0e049800" (UID: "f70ab77b-2e78-421a-8563-8d8d0e049800"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561549 4606 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561581 4606 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561593 4606 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561602 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561611 4606 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561618 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561628 4606 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 
crc kubenswrapper[4606]: I1212 01:12:03.561640 4606 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f70ab77b-2e78-421a-8563-8d8d0e049800-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.561648 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64ggk\" (UniqueName: \"kubernetes.io/projected/f70ab77b-2e78-421a-8563-8d8d0e049800-kube-api-access-64ggk\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.909857 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" event={"ID":"f70ab77b-2e78-421a-8563-8d8d0e049800","Type":"ContainerDied","Data":"374aa9f52e40b920034b0e8f55f0eb873ed480c5ac1308c976d0e9d4b0ef53b9"} Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.909912 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="374aa9f52e40b920034b0e8f55f0eb873ed480c5ac1308c976d0e9d4b0ef53b9" Dec 12 01:12:03 crc kubenswrapper[4606]: I1212 01:12:03.909997 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4ls7g" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.023852 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4"] Dec 12 01:12:04 crc kubenswrapper[4606]: E1212 01:12:04.035303 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f70ab77b-2e78-421a-8563-8d8d0e049800" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.035337 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70ab77b-2e78-421a-8563-8d8d0e049800" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.047490 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f70ab77b-2e78-421a-8563-8d8d0e049800" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.048081 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4"] Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.048242 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.052785 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.053058 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.053202 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.053223 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.053253 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w59bl" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.173033 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.173088 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.173127 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.173150 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm6tq\" (UniqueName: \"kubernetes.io/projected/bf6da039-4ee5-40c3-90b0-606cd302ee04-kube-api-access-bm6tq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.173384 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.173435 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.173493 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.274949 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.275071 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.275095 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm6tq\" (UniqueName: \"kubernetes.io/projected/bf6da039-4ee5-40c3-90b0-606cd302ee04-kube-api-access-bm6tq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.275162 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: 
\"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.275941 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.276413 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.276573 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.279635 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.283981 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.284403 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.284756 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.284929 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.289991 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.297901 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm6tq\" (UniqueName: \"kubernetes.io/projected/bf6da039-4ee5-40c3-90b0-606cd302ee04-kube-api-access-bm6tq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.367294 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:12:04 crc kubenswrapper[4606]: I1212 01:12:04.961451 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4"] Dec 12 01:12:05 crc kubenswrapper[4606]: I1212 01:12:05.935942 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" event={"ID":"bf6da039-4ee5-40c3-90b0-606cd302ee04","Type":"ContainerStarted","Data":"5430465020ba6b06927e421561e8c6ac66fddfb5e2f9fb8435f2b86e8f6ac354"} Dec 12 01:12:05 crc kubenswrapper[4606]: I1212 01:12:05.936571 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" event={"ID":"bf6da039-4ee5-40c3-90b0-606cd302ee04","Type":"ContainerStarted","Data":"e44ec147587348411a716553ff3b2926304deb82811d97a8d899d5a713fa0726"} Dec 12 01:12:05 crc kubenswrapper[4606]: I1212 01:12:05.958693 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" podStartSLOduration=1.7786004709999998 podStartE2EDuration="1.958669311s" podCreationTimestamp="2025-12-12 
01:12:04 +0000 UTC" firstStartedPulling="2025-12-12 01:12:04.968379185 +0000 UTC m=+2915.513732051" lastFinishedPulling="2025-12-12 01:12:05.148448025 +0000 UTC m=+2915.693800891" observedRunningTime="2025-12-12 01:12:05.95408646 +0000 UTC m=+2916.499439336" watchObservedRunningTime="2025-12-12 01:12:05.958669311 +0000 UTC m=+2916.504022177" Dec 12 01:12:10 crc kubenswrapper[4606]: I1212 01:12:10.699603 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:12:10 crc kubenswrapper[4606]: E1212 01:12:10.700152 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:12:10 crc kubenswrapper[4606]: I1212 01:12:10.944231 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:12:11 crc kubenswrapper[4606]: I1212 01:12:11.000126 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:12:11 crc kubenswrapper[4606]: I1212 01:12:11.190482 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pc77x"] Dec 12 01:12:11 crc kubenswrapper[4606]: I1212 01:12:11.987959 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pc77x" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="registry-server" containerID="cri-o://017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4" gracePeriod=2 Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.459397 4606 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.633822 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-catalog-content\") pod \"288bb9b8-2ffc-4584-81d6-863512639e3e\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.634038 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl96v\" (UniqueName: \"kubernetes.io/projected/288bb9b8-2ffc-4584-81d6-863512639e3e-kube-api-access-dl96v\") pod \"288bb9b8-2ffc-4584-81d6-863512639e3e\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.634062 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-utilities\") pod \"288bb9b8-2ffc-4584-81d6-863512639e3e\" (UID: \"288bb9b8-2ffc-4584-81d6-863512639e3e\") " Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.635202 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-utilities" (OuterVolumeSpecName: "utilities") pod "288bb9b8-2ffc-4584-81d6-863512639e3e" (UID: "288bb9b8-2ffc-4584-81d6-863512639e3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.642517 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288bb9b8-2ffc-4584-81d6-863512639e3e-kube-api-access-dl96v" (OuterVolumeSpecName: "kube-api-access-dl96v") pod "288bb9b8-2ffc-4584-81d6-863512639e3e" (UID: "288bb9b8-2ffc-4584-81d6-863512639e3e"). 
InnerVolumeSpecName "kube-api-access-dl96v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.736332 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl96v\" (UniqueName: \"kubernetes.io/projected/288bb9b8-2ffc-4584-81d6-863512639e3e-kube-api-access-dl96v\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.736369 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.757076 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "288bb9b8-2ffc-4584-81d6-863512639e3e" (UID: "288bb9b8-2ffc-4584-81d6-863512639e3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:12:12 crc kubenswrapper[4606]: I1212 01:12:12.838822 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/288bb9b8-2ffc-4584-81d6-863512639e3e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.003422 4606 generic.go:334] "Generic (PLEG): container finished" podID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerID="017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4" exitCode=0 Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.003469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pc77x" event={"ID":"288bb9b8-2ffc-4584-81d6-863512639e3e","Type":"ContainerDied","Data":"017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4"} Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.003498 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pc77x" event={"ID":"288bb9b8-2ffc-4584-81d6-863512639e3e","Type":"ContainerDied","Data":"527bc57a2c6d8156a02b762b3eebd3a5e4095e728ac8b57e8cd4d31fbd283731"} Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.003536 4606 scope.go:117] "RemoveContainer" containerID="017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.003684 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pc77x" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.026975 4606 scope.go:117] "RemoveContainer" containerID="b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.050081 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pc77x"] Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.058431 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pc77x"] Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.115019 4606 scope.go:117] "RemoveContainer" containerID="0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.144151 4606 scope.go:117] "RemoveContainer" containerID="017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4" Dec 12 01:12:13 crc kubenswrapper[4606]: E1212 01:12:13.144991 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4\": container with ID starting with 017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4 not found: ID does not exist" containerID="017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.145024 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4"} err="failed to get container status \"017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4\": rpc error: code = NotFound desc = could not find container \"017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4\": container with ID starting with 017384e4f9ce6969c67cb2d932a923e70d575bb56c6c5f4a9a55c756a3c00be4 not found: ID does 
not exist" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.145047 4606 scope.go:117] "RemoveContainer" containerID="b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090" Dec 12 01:12:13 crc kubenswrapper[4606]: E1212 01:12:13.145338 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090\": container with ID starting with b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090 not found: ID does not exist" containerID="b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.145365 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090"} err="failed to get container status \"b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090\": rpc error: code = NotFound desc = could not find container \"b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090\": container with ID starting with b1bbf3469be7716bed6c0de118ba9ed9876ebbfa319dae41f461aac2467a6090 not found: ID does not exist" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.145378 4606 scope.go:117] "RemoveContainer" containerID="0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a" Dec 12 01:12:13 crc kubenswrapper[4606]: E1212 01:12:13.145573 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a\": container with ID starting with 0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a not found: ID does not exist" containerID="0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.145665 4606 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a"} err="failed to get container status \"0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a\": rpc error: code = NotFound desc = could not find container \"0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a\": container with ID starting with 0c8a8b6c32461080887dae3eb52053a878706dc3db5c26da5f3a26f0a0f7dd4a not found: ID does not exist" Dec 12 01:12:13 crc kubenswrapper[4606]: I1212 01:12:13.718154 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" path="/var/lib/kubelet/pods/288bb9b8-2ffc-4584-81d6-863512639e3e/volumes" Dec 12 01:12:24 crc kubenswrapper[4606]: I1212 01:12:24.700347 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:12:24 crc kubenswrapper[4606]: E1212 01:12:24.701912 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:12:37 crc kubenswrapper[4606]: I1212 01:12:37.700105 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:12:37 crc kubenswrapper[4606]: E1212 01:12:37.700955 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:12:48 crc kubenswrapper[4606]: I1212 01:12:48.699848 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:12:48 crc kubenswrapper[4606]: E1212 01:12:48.701061 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:13:02 crc kubenswrapper[4606]: I1212 01:13:02.700120 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:13:02 crc kubenswrapper[4606]: E1212 01:13:02.700962 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:13:16 crc kubenswrapper[4606]: I1212 01:13:16.700681 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:13:16 crc kubenswrapper[4606]: E1212 01:13:16.701685 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:13:27 crc kubenswrapper[4606]: I1212 01:13:27.700305 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:13:27 crc kubenswrapper[4606]: E1212 01:13:27.702262 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:13:39 crc kubenswrapper[4606]: I1212 01:13:39.713350 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:13:39 crc kubenswrapper[4606]: E1212 01:13:39.714677 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:13:52 crc kubenswrapper[4606]: I1212 01:13:52.700476 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:13:52 crc kubenswrapper[4606]: E1212 01:13:52.704159 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:14:04 crc kubenswrapper[4606]: I1212 01:14:04.700840 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:14:04 crc kubenswrapper[4606]: E1212 01:14:04.701877 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:14:19 crc kubenswrapper[4606]: I1212 01:14:19.700395 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:14:19 crc kubenswrapper[4606]: E1212 01:14:19.702580 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:14:30 crc kubenswrapper[4606]: I1212 01:14:30.700344 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:14:30 crc kubenswrapper[4606]: E1212 01:14:30.701048 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:14:44 crc kubenswrapper[4606]: I1212 01:14:44.699639 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:14:44 crc kubenswrapper[4606]: E1212 01:14:44.700456 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:14:57 crc kubenswrapper[4606]: I1212 01:14:57.700407 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:14:57 crc kubenswrapper[4606]: E1212 01:14:57.701025 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.158108 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97"] Dec 12 01:15:00 crc kubenswrapper[4606]: E1212 01:15:00.158517 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" 
containerName="extract-content" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.158529 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="extract-content" Dec 12 01:15:00 crc kubenswrapper[4606]: E1212 01:15:00.158555 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="extract-utilities" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.158562 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="extract-utilities" Dec 12 01:15:00 crc kubenswrapper[4606]: E1212 01:15:00.158589 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="registry-server" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.158595 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="registry-server" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.158776 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="288bb9b8-2ffc-4584-81d6-863512639e3e" containerName="registry-server" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.159369 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.161622 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.162018 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.179267 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97"] Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.195997 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14f523-33c1-43e9-a143-c423cf7ec754-config-volume\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.196096 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14f523-33c1-43e9-a143-c423cf7ec754-secret-volume\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.196268 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bj2b\" (UniqueName: \"kubernetes.io/projected/ec14f523-33c1-43e9-a143-c423cf7ec754-kube-api-access-9bj2b\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.297434 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14f523-33c1-43e9-a143-c423cf7ec754-config-volume\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.297538 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14f523-33c1-43e9-a143-c423cf7ec754-secret-volume\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.297680 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bj2b\" (UniqueName: \"kubernetes.io/projected/ec14f523-33c1-43e9-a143-c423cf7ec754-kube-api-access-9bj2b\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.298656 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14f523-33c1-43e9-a143-c423cf7ec754-config-volume\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.304155 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ec14f523-33c1-43e9-a143-c423cf7ec754-secret-volume\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.354870 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bj2b\" (UniqueName: \"kubernetes.io/projected/ec14f523-33c1-43e9-a143-c423cf7ec754-kube-api-access-9bj2b\") pod \"collect-profiles-29425035-dcr97\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.500323 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:00 crc kubenswrapper[4606]: I1212 01:15:00.980396 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97"] Dec 12 01:15:01 crc kubenswrapper[4606]: I1212 01:15:01.619212 4606 generic.go:334] "Generic (PLEG): container finished" podID="ec14f523-33c1-43e9-a143-c423cf7ec754" containerID="da991ef27524182ee9ccd4effff3577e3692d1d6724bb42c5102b5194eeba703" exitCode=0 Dec 12 01:15:01 crc kubenswrapper[4606]: I1212 01:15:01.619304 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" event={"ID":"ec14f523-33c1-43e9-a143-c423cf7ec754","Type":"ContainerDied","Data":"da991ef27524182ee9ccd4effff3577e3692d1d6724bb42c5102b5194eeba703"} Dec 12 01:15:01 crc kubenswrapper[4606]: I1212 01:15:01.619566 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" 
event={"ID":"ec14f523-33c1-43e9-a143-c423cf7ec754","Type":"ContainerStarted","Data":"f5bc171867433dd79740e684e7bffbe605df0042ea9dd0df8957f22569ca169b"} Dec 12 01:15:02 crc kubenswrapper[4606]: I1212 01:15:02.954098 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.060684 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14f523-33c1-43e9-a143-c423cf7ec754-config-volume\") pod \"ec14f523-33c1-43e9-a143-c423cf7ec754\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.060776 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14f523-33c1-43e9-a143-c423cf7ec754-secret-volume\") pod \"ec14f523-33c1-43e9-a143-c423cf7ec754\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.060863 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bj2b\" (UniqueName: \"kubernetes.io/projected/ec14f523-33c1-43e9-a143-c423cf7ec754-kube-api-access-9bj2b\") pod \"ec14f523-33c1-43e9-a143-c423cf7ec754\" (UID: \"ec14f523-33c1-43e9-a143-c423cf7ec754\") " Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.061885 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec14f523-33c1-43e9-a143-c423cf7ec754-config-volume" (OuterVolumeSpecName: "config-volume") pod "ec14f523-33c1-43e9-a143-c423cf7ec754" (UID: "ec14f523-33c1-43e9-a143-c423cf7ec754"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.062024 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14f523-33c1-43e9-a143-c423cf7ec754-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.070457 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec14f523-33c1-43e9-a143-c423cf7ec754-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ec14f523-33c1-43e9-a143-c423cf7ec754" (UID: "ec14f523-33c1-43e9-a143-c423cf7ec754"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.070671 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec14f523-33c1-43e9-a143-c423cf7ec754-kube-api-access-9bj2b" (OuterVolumeSpecName: "kube-api-access-9bj2b") pod "ec14f523-33c1-43e9-a143-c423cf7ec754" (UID: "ec14f523-33c1-43e9-a143-c423cf7ec754"). InnerVolumeSpecName "kube-api-access-9bj2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.164589 4606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14f523-33c1-43e9-a143-c423cf7ec754-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.164627 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bj2b\" (UniqueName: \"kubernetes.io/projected/ec14f523-33c1-43e9-a143-c423cf7ec754-kube-api-access-9bj2b\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.639523 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" event={"ID":"ec14f523-33c1-43e9-a143-c423cf7ec754","Type":"ContainerDied","Data":"f5bc171867433dd79740e684e7bffbe605df0042ea9dd0df8957f22569ca169b"} Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.639570 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bc171867433dd79740e684e7bffbe605df0042ea9dd0df8957f22569ca169b" Dec 12 01:15:03 crc kubenswrapper[4606]: I1212 01:15:03.639629 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425035-dcr97" Dec 12 01:15:04 crc kubenswrapper[4606]: I1212 01:15:04.033681 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m"] Dec 12 01:15:04 crc kubenswrapper[4606]: I1212 01:15:04.046828 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424990-7td8m"] Dec 12 01:15:05 crc kubenswrapper[4606]: I1212 01:15:05.711592 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2" path="/var/lib/kubelet/pods/0ee5a4a4-3f2c-4e71-be6e-4b95896ebcf2/volumes" Dec 12 01:15:09 crc kubenswrapper[4606]: I1212 01:15:09.707941 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:15:09 crc kubenswrapper[4606]: E1212 01:15:09.708733 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:15:21 crc kubenswrapper[4606]: I1212 01:15:21.700976 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:15:21 crc kubenswrapper[4606]: E1212 01:15:21.701783 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:15:28 crc kubenswrapper[4606]: I1212 01:15:28.868734 4606 generic.go:334] "Generic (PLEG): container finished" podID="bf6da039-4ee5-40c3-90b0-606cd302ee04" containerID="5430465020ba6b06927e421561e8c6ac66fddfb5e2f9fb8435f2b86e8f6ac354" exitCode=0 Dec 12 01:15:28 crc kubenswrapper[4606]: I1212 01:15:28.868874 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" event={"ID":"bf6da039-4ee5-40c3-90b0-606cd302ee04","Type":"ContainerDied","Data":"5430465020ba6b06927e421561e8c6ac66fddfb5e2f9fb8435f2b86e8f6ac354"} Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.353284 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.426155 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ssh-key\") pod \"bf6da039-4ee5-40c3-90b0-606cd302ee04\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.426275 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm6tq\" (UniqueName: \"kubernetes.io/projected/bf6da039-4ee5-40c3-90b0-606cd302ee04-kube-api-access-bm6tq\") pod \"bf6da039-4ee5-40c3-90b0-606cd302ee04\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.426344 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-telemetry-combined-ca-bundle\") pod \"bf6da039-4ee5-40c3-90b0-606cd302ee04\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") 
" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.426406 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-1\") pod \"bf6da039-4ee5-40c3-90b0-606cd302ee04\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.426442 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-2\") pod \"bf6da039-4ee5-40c3-90b0-606cd302ee04\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.426465 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-0\") pod \"bf6da039-4ee5-40c3-90b0-606cd302ee04\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.426487 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-inventory\") pod \"bf6da039-4ee5-40c3-90b0-606cd302ee04\" (UID: \"bf6da039-4ee5-40c3-90b0-606cd302ee04\") " Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.433760 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bf6da039-4ee5-40c3-90b0-606cd302ee04" (UID: "bf6da039-4ee5-40c3-90b0-606cd302ee04"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.438636 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6da039-4ee5-40c3-90b0-606cd302ee04-kube-api-access-bm6tq" (OuterVolumeSpecName: "kube-api-access-bm6tq") pod "bf6da039-4ee5-40c3-90b0-606cd302ee04" (UID: "bf6da039-4ee5-40c3-90b0-606cd302ee04"). InnerVolumeSpecName "kube-api-access-bm6tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.466179 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "bf6da039-4ee5-40c3-90b0-606cd302ee04" (UID: "bf6da039-4ee5-40c3-90b0-606cd302ee04"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.468033 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-inventory" (OuterVolumeSpecName: "inventory") pod "bf6da039-4ee5-40c3-90b0-606cd302ee04" (UID: "bf6da039-4ee5-40c3-90b0-606cd302ee04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.471699 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "bf6da039-4ee5-40c3-90b0-606cd302ee04" (UID: "bf6da039-4ee5-40c3-90b0-606cd302ee04"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.489358 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "bf6da039-4ee5-40c3-90b0-606cd302ee04" (UID: "bf6da039-4ee5-40c3-90b0-606cd302ee04"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.489892 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bf6da039-4ee5-40c3-90b0-606cd302ee04" (UID: "bf6da039-4ee5-40c3-90b0-606cd302ee04"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.528734 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm6tq\" (UniqueName: \"kubernetes.io/projected/bf6da039-4ee5-40c3-90b0-606cd302ee04-kube-api-access-bm6tq\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.528947 4606 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.529030 4606 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.529123 4606 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" 
(UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.529256 4606 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.529351 4606 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-inventory\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.529449 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6da039-4ee5-40c3-90b0-606cd302ee04-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.886856 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" event={"ID":"bf6da039-4ee5-40c3-90b0-606cd302ee04","Type":"ContainerDied","Data":"e44ec147587348411a716553ff3b2926304deb82811d97a8d899d5a713fa0726"} Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.887616 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44ec147587348411a716553ff3b2926304deb82811d97a8d899d5a713fa0726" Dec 12 01:15:30 crc kubenswrapper[4606]: I1212 01:15:30.887089 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4" Dec 12 01:15:36 crc kubenswrapper[4606]: I1212 01:15:36.700599 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:15:36 crc kubenswrapper[4606]: E1212 01:15:36.701528 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:15:49 crc kubenswrapper[4606]: I1212 01:15:49.725084 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:15:49 crc kubenswrapper[4606]: E1212 01:15:49.727346 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:16:01 crc kubenswrapper[4606]: I1212 01:16:01.699722 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:16:01 crc kubenswrapper[4606]: E1212 01:16:01.700422 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:16:03 crc kubenswrapper[4606]: I1212 01:16:03.858531 4606 scope.go:117] "RemoveContainer" containerID="0758f1c087a8f7581b046dc559ae82dfe863767335aa9c8506f861dbc5a85de2" Dec 12 01:16:16 crc kubenswrapper[4606]: I1212 01:16:16.699880 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:16:17 crc kubenswrapper[4606]: I1212 01:16:17.335271 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"f229baf38a7f21fe7b01bef3e91a8c649ea56881ce0bff1f8c08ca952dad853d"} Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.652029 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 12 01:16:29 crc kubenswrapper[4606]: E1212 01:16:29.654522 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec14f523-33c1-43e9-a143-c423cf7ec754" containerName="collect-profiles" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.654680 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec14f523-33c1-43e9-a143-c423cf7ec754" containerName="collect-profiles" Dec 12 01:16:29 crc kubenswrapper[4606]: E1212 01:16:29.654852 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6da039-4ee5-40c3-90b0-606cd302ee04" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.654955 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6da039-4ee5-40c3-90b0-606cd302ee04" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.655342 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6da039-4ee5-40c3-90b0-606cd302ee04" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.655492 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec14f523-33c1-43e9-a143-c423cf7ec754" containerName="collect-profiles" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.657071 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.660158 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xrmfx" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.660609 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.662920 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.663377 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.673690 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.723904 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.723993 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5ch\" (UniqueName: 
\"kubernetes.io/projected/4b099e40-725d-42e2-84fc-6ed969a20e5f-kube-api-access-cf5ch\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.724084 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.724154 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.724292 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.725937 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.725995 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.726234 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.726295 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-config-data\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828391 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828463 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5ch\" (UniqueName: \"kubernetes.io/projected/4b099e40-725d-42e2-84fc-6ed969a20e5f-kube-api-access-cf5ch\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828536 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ssh-key\") 
pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828585 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828619 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828688 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828774 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828980 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") device mount path 
\"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.829870 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.830486 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.828985 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.832131 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-config-data\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.832473 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc 
kubenswrapper[4606]: I1212 01:16:29.833560 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.834435 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.835859 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.838402 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.843533 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-config-data\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.843957 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.861273 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5ch\" (UniqueName: 
\"kubernetes.io/projected/4b099e40-725d-42e2-84fc-6ed969a20e5f-kube-api-access-cf5ch\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.877559 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " pod="openstack/tempest-tests-tempest" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.982565 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xrmfx" Dec 12 01:16:29 crc kubenswrapper[4606]: I1212 01:16:29.991822 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 12 01:16:30 crc kubenswrapper[4606]: I1212 01:16:30.518586 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 12 01:16:31 crc kubenswrapper[4606]: I1212 01:16:31.452867 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b099e40-725d-42e2-84fc-6ed969a20e5f","Type":"ContainerStarted","Data":"fd0ef95278326cc8cd5af7cacb365d6b41c6ce2731b5a2a8ad5191be2ef1a894"} Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.246456 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fxcrs"] Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.249079 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.294232 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxcrs"] Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.344062 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sggsr\" (UniqueName: \"kubernetes.io/projected/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-kube-api-access-sggsr\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.344161 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-catalog-content\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.344265 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-utilities\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.445523 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sggsr\" (UniqueName: \"kubernetes.io/projected/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-kube-api-access-sggsr\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.445598 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-catalog-content\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.445653 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-utilities\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.446100 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-utilities\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.446426 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-catalog-content\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.470798 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sggsr\" (UniqueName: \"kubernetes.io/projected/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-kube-api-access-sggsr\") pod \"community-operators-fxcrs\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:39 crc kubenswrapper[4606]: I1212 01:16:39.580845 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:16:40 crc kubenswrapper[4606]: I1212 01:16:40.548287 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxcrs"] Dec 12 01:16:40 crc kubenswrapper[4606]: I1212 01:16:40.570865 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxcrs" event={"ID":"8479db5e-ae4a-474d-9ff7-5dc3c16821ae","Type":"ContainerStarted","Data":"8f0e3502d91cb9d35405763cf4e2ec8f0577032c27ff71f4cfdd419c104babe6"} Dec 12 01:16:41 crc kubenswrapper[4606]: I1212 01:16:41.592663 4606 generic.go:334] "Generic (PLEG): container finished" podID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerID="7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3" exitCode=0 Dec 12 01:16:41 crc kubenswrapper[4606]: I1212 01:16:41.593118 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxcrs" event={"ID":"8479db5e-ae4a-474d-9ff7-5dc3c16821ae","Type":"ContainerDied","Data":"7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3"} Dec 12 01:16:43 crc kubenswrapper[4606]: I1212 01:16:43.617749 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxcrs" event={"ID":"8479db5e-ae4a-474d-9ff7-5dc3c16821ae","Type":"ContainerStarted","Data":"9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1"} Dec 12 01:16:44 crc kubenswrapper[4606]: I1212 01:16:44.628409 4606 generic.go:334] "Generic (PLEG): container finished" podID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerID="9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1" exitCode=0 Dec 12 01:16:44 crc kubenswrapper[4606]: I1212 01:16:44.628472 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxcrs" 
event={"ID":"8479db5e-ae4a-474d-9ff7-5dc3c16821ae","Type":"ContainerDied","Data":"9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1"} Dec 12 01:17:05 crc kubenswrapper[4606]: I1212 01:17:05.842889 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 01:17:05 crc kubenswrapper[4606]: E1212 01:17:05.929366 4606 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 12 01:17:05 crc kubenswrapper[4606]: E1212 01:17:05.932380 4606 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Vol
umeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cf5ch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(4b099e40-725d-42e2-84fc-6ed969a20e5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 12 01:17:05 crc kubenswrapper[4606]: E1212 01:17:05.933560 4606 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4b099e40-725d-42e2-84fc-6ed969a20e5f" Dec 12 01:17:06 crc kubenswrapper[4606]: I1212 01:17:06.836332 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxcrs" event={"ID":"8479db5e-ae4a-474d-9ff7-5dc3c16821ae","Type":"ContainerStarted","Data":"59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf"} Dec 12 01:17:06 crc kubenswrapper[4606]: E1212 01:17:06.838236 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="4b099e40-725d-42e2-84fc-6ed969a20e5f" Dec 12 01:17:06 crc kubenswrapper[4606]: I1212 01:17:06.880007 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fxcrs" podStartSLOduration=2.934662007 podStartE2EDuration="27.879970186s" podCreationTimestamp="2025-12-12 01:16:39 +0000 UTC" firstStartedPulling="2025-12-12 01:16:41.595079573 +0000 UTC m=+3192.140432439" lastFinishedPulling="2025-12-12 01:17:06.540387752 +0000 UTC m=+3217.085740618" observedRunningTime="2025-12-12 01:17:06.874605085 +0000 UTC m=+3217.419957971" watchObservedRunningTime="2025-12-12 01:17:06.879970186 +0000 UTC m=+3217.425323052" Dec 12 01:17:09 crc kubenswrapper[4606]: I1212 01:17:09.581906 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:17:09 crc kubenswrapper[4606]: I1212 01:17:09.582282 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:17:09 crc kubenswrapper[4606]: I1212 01:17:09.648045 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:17:19 crc kubenswrapper[4606]: I1212 01:17:19.645204 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:17:19 crc kubenswrapper[4606]: I1212 01:17:19.697411 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxcrs"] Dec 12 01:17:19 crc kubenswrapper[4606]: I1212 01:17:19.968211 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fxcrs" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerName="registry-server" containerID="cri-o://59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf" gracePeriod=2 Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.276807 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.529242 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.625298 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-utilities\") pod \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.625366 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sggsr\" (UniqueName: \"kubernetes.io/projected/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-kube-api-access-sggsr\") pod \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.625450 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-catalog-content\") pod \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\" (UID: \"8479db5e-ae4a-474d-9ff7-5dc3c16821ae\") " Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.629158 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-utilities" (OuterVolumeSpecName: "utilities") pod "8479db5e-ae4a-474d-9ff7-5dc3c16821ae" (UID: "8479db5e-ae4a-474d-9ff7-5dc3c16821ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.633785 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-kube-api-access-sggsr" (OuterVolumeSpecName: "kube-api-access-sggsr") pod "8479db5e-ae4a-474d-9ff7-5dc3c16821ae" (UID: "8479db5e-ae4a-474d-9ff7-5dc3c16821ae"). InnerVolumeSpecName "kube-api-access-sggsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.674417 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8479db5e-ae4a-474d-9ff7-5dc3c16821ae" (UID: "8479db5e-ae4a-474d-9ff7-5dc3c16821ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.728247 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.728463 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sggsr\" (UniqueName: \"kubernetes.io/projected/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-kube-api-access-sggsr\") on node \"crc\" DevicePath \"\"" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.728574 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8479db5e-ae4a-474d-9ff7-5dc3c16821ae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.979651 4606 generic.go:334] "Generic (PLEG): container finished" podID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerID="59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf" exitCode=0 Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.979703 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxcrs" event={"ID":"8479db5e-ae4a-474d-9ff7-5dc3c16821ae","Type":"ContainerDied","Data":"59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf"} Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.979727 4606 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-fxcrs" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.979745 4606 scope.go:117] "RemoveContainer" containerID="59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf" Dec 12 01:17:20 crc kubenswrapper[4606]: I1212 01:17:20.979734 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxcrs" event={"ID":"8479db5e-ae4a-474d-9ff7-5dc3c16821ae","Type":"ContainerDied","Data":"8f0e3502d91cb9d35405763cf4e2ec8f0577032c27ff71f4cfdd419c104babe6"} Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.016433 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxcrs"] Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.027083 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fxcrs"] Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.027571 4606 scope.go:117] "RemoveContainer" containerID="9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.054713 4606 scope.go:117] "RemoveContainer" containerID="7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.079870 4606 scope.go:117] "RemoveContainer" containerID="59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf" Dec 12 01:17:21 crc kubenswrapper[4606]: E1212 01:17:21.080352 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf\": container with ID starting with 59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf not found: ID does not exist" containerID="59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.080396 
4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf"} err="failed to get container status \"59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf\": rpc error: code = NotFound desc = could not find container \"59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf\": container with ID starting with 59063ba7711fb0fdc6c174f0438c89fd5c36ad29a5b1a62313a5a095b98cd3bf not found: ID does not exist" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.080429 4606 scope.go:117] "RemoveContainer" containerID="9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1" Dec 12 01:17:21 crc kubenswrapper[4606]: E1212 01:17:21.080848 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1\": container with ID starting with 9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1 not found: ID does not exist" containerID="9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.080878 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1"} err="failed to get container status \"9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1\": rpc error: code = NotFound desc = could not find container \"9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1\": container with ID starting with 9ba441ece8b550f3949877a4e08337058777cd351cbc8498ebfe8ebe859136e1 not found: ID does not exist" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.080928 4606 scope.go:117] "RemoveContainer" containerID="7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3" Dec 12 01:17:21 crc kubenswrapper[4606]: E1212 
01:17:21.081215 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3\": container with ID starting with 7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3 not found: ID does not exist" containerID="7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.081245 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3"} err="failed to get container status \"7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3\": rpc error: code = NotFound desc = could not find container \"7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3\": container with ID starting with 7b788a551d57d95faa6e8524c4a8d63b3ddaec361b4af55c7894c1f703b585f3 not found: ID does not exist" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.711003 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" path="/var/lib/kubelet/pods/8479db5e-ae4a-474d-9ff7-5dc3c16821ae/volumes" Dec 12 01:17:21 crc kubenswrapper[4606]: I1212 01:17:21.991154 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b099e40-725d-42e2-84fc-6ed969a20e5f","Type":"ContainerStarted","Data":"c0bd3e1cd3aaff04345df35a82960b7e5528a79f5456f54d7e0bf0565d0f9c28"} Dec 12 01:17:22 crc kubenswrapper[4606]: I1212 01:17:22.015025 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.2791147 podStartE2EDuration="54.015005971s" podCreationTimestamp="2025-12-12 01:16:28 +0000 UTC" firstStartedPulling="2025-12-12 01:16:30.535394366 +0000 UTC m=+3181.080747242" lastFinishedPulling="2025-12-12 01:17:20.271285637 +0000 
UTC m=+3230.816638513" observedRunningTime="2025-12-12 01:17:22.013653146 +0000 UTC m=+3232.559006012" watchObservedRunningTime="2025-12-12 01:17:22.015005971 +0000 UTC m=+3232.560358837" Dec 12 01:18:32 crc kubenswrapper[4606]: I1212 01:18:32.010899 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:18:32 crc kubenswrapper[4606]: I1212 01:18:32.011604 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:19:02 crc kubenswrapper[4606]: I1212 01:19:02.012472 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:19:02 crc kubenswrapper[4606]: I1212 01:19:02.013031 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.010269 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.010837 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.010912 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.011789 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f229baf38a7f21fe7b01bef3e91a8c649ea56881ce0bff1f8c08ca952dad853d"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.011864 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://f229baf38a7f21fe7b01bef3e91a8c649ea56881ce0bff1f8c08ca952dad853d" gracePeriod=600 Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.181307 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="f229baf38a7f21fe7b01bef3e91a8c649ea56881ce0bff1f8c08ca952dad853d" exitCode=0 Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.181375 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" 
event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"f229baf38a7f21fe7b01bef3e91a8c649ea56881ce0bff1f8c08ca952dad853d"} Dec 12 01:19:32 crc kubenswrapper[4606]: I1212 01:19:32.181674 4606 scope.go:117] "RemoveContainer" containerID="d143f1fd2961e240236733dcda2a6e280af9c700504d512a4569cefca1d9830f" Dec 12 01:19:33 crc kubenswrapper[4606]: I1212 01:19:33.191774 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada"} Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.638389 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pl92t"] Dec 12 01:19:51 crc kubenswrapper[4606]: E1212 01:19:51.639348 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerName="extract-utilities" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.639363 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerName="extract-utilities" Dec 12 01:19:51 crc kubenswrapper[4606]: E1212 01:19:51.639401 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerName="extract-content" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.639409 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerName="extract-content" Dec 12 01:19:51 crc kubenswrapper[4606]: E1212 01:19:51.639431 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerName="registry-server" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.639439 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" 
containerName="registry-server" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.639654 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8479db5e-ae4a-474d-9ff7-5dc3c16821ae" containerName="registry-server" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.644832 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.667577 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl92t"] Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.753727 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-catalog-content\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.753909 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrbh9\" (UniqueName: \"kubernetes.io/projected/94601fe7-f652-4b4c-9dde-c09e889c0583-kube-api-access-lrbh9\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.753957 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-utilities\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.856285 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lrbh9\" (UniqueName: \"kubernetes.io/projected/94601fe7-f652-4b4c-9dde-c09e889c0583-kube-api-access-lrbh9\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.856354 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-utilities\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.856804 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-catalog-content\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.856823 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-utilities\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.857323 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-catalog-content\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.882113 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrbh9\" (UniqueName: 
\"kubernetes.io/projected/94601fe7-f652-4b4c-9dde-c09e889c0583-kube-api-access-lrbh9\") pod \"redhat-marketplace-pl92t\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:51 crc kubenswrapper[4606]: I1212 01:19:51.981423 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:19:52 crc kubenswrapper[4606]: W1212 01:19:52.681444 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94601fe7_f652_4b4c_9dde_c09e889c0583.slice/crio-1d94b8c631be8a6dacc397999a05cc38e2e88293a125e3baf88dd6ec18e6e61c WatchSource:0}: Error finding container 1d94b8c631be8a6dacc397999a05cc38e2e88293a125e3baf88dd6ec18e6e61c: Status 404 returned error can't find the container with id 1d94b8c631be8a6dacc397999a05cc38e2e88293a125e3baf88dd6ec18e6e61c Dec 12 01:19:52 crc kubenswrapper[4606]: I1212 01:19:52.688423 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl92t"] Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.386338 4606 generic.go:334] "Generic (PLEG): container finished" podID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerID="079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162" exitCode=0 Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.386420 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl92t" event={"ID":"94601fe7-f652-4b4c-9dde-c09e889c0583","Type":"ContainerDied","Data":"079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162"} Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.386698 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl92t" 
event={"ID":"94601fe7-f652-4b4c-9dde-c09e889c0583","Type":"ContainerStarted","Data":"1d94b8c631be8a6dacc397999a05cc38e2e88293a125e3baf88dd6ec18e6e61c"} Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.442085 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b6r8t"] Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.448238 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.465003 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6r8t"] Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.588312 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-catalog-content\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.588391 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-utilities\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.588460 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlng\" (UniqueName: \"kubernetes.io/projected/2c78d463-36af-47ea-8f89-aaf053311ee4-kube-api-access-xjlng\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.689686 
4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-utilities\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.689804 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlng\" (UniqueName: \"kubernetes.io/projected/2c78d463-36af-47ea-8f89-aaf053311ee4-kube-api-access-xjlng\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.689879 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-catalog-content\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.690317 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-catalog-content\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.690517 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-utilities\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.726097 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xjlng\" (UniqueName: \"kubernetes.io/projected/2c78d463-36af-47ea-8f89-aaf053311ee4-kube-api-access-xjlng\") pod \"certified-operators-b6r8t\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:53 crc kubenswrapper[4606]: I1212 01:19:53.762976 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:19:54 crc kubenswrapper[4606]: I1212 01:19:54.292895 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6r8t"] Dec 12 01:19:54 crc kubenswrapper[4606]: W1212 01:19:54.301871 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c78d463_36af_47ea_8f89_aaf053311ee4.slice/crio-4ecd3b24dfb2421fc2563739e393b361eed8c084c8653af74dfaf123dfb6fde6 WatchSource:0}: Error finding container 4ecd3b24dfb2421fc2563739e393b361eed8c084c8653af74dfaf123dfb6fde6: Status 404 returned error can't find the container with id 4ecd3b24dfb2421fc2563739e393b361eed8c084c8653af74dfaf123dfb6fde6 Dec 12 01:19:54 crc kubenswrapper[4606]: I1212 01:19:54.397524 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl92t" event={"ID":"94601fe7-f652-4b4c-9dde-c09e889c0583","Type":"ContainerStarted","Data":"68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323"} Dec 12 01:19:54 crc kubenswrapper[4606]: I1212 01:19:54.398587 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6r8t" event={"ID":"2c78d463-36af-47ea-8f89-aaf053311ee4","Type":"ContainerStarted","Data":"4ecd3b24dfb2421fc2563739e393b361eed8c084c8653af74dfaf123dfb6fde6"} Dec 12 01:19:55 crc kubenswrapper[4606]: I1212 01:19:55.408328 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerID="8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff" exitCode=0 Dec 12 01:19:55 crc kubenswrapper[4606]: I1212 01:19:55.408591 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6r8t" event={"ID":"2c78d463-36af-47ea-8f89-aaf053311ee4","Type":"ContainerDied","Data":"8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff"} Dec 12 01:19:55 crc kubenswrapper[4606]: I1212 01:19:55.417532 4606 generic.go:334] "Generic (PLEG): container finished" podID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerID="68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323" exitCode=0 Dec 12 01:19:55 crc kubenswrapper[4606]: I1212 01:19:55.417580 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl92t" event={"ID":"94601fe7-f652-4b4c-9dde-c09e889c0583","Type":"ContainerDied","Data":"68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323"} Dec 12 01:19:56 crc kubenswrapper[4606]: I1212 01:19:56.426421 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6r8t" event={"ID":"2c78d463-36af-47ea-8f89-aaf053311ee4","Type":"ContainerStarted","Data":"a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4"} Dec 12 01:19:56 crc kubenswrapper[4606]: I1212 01:19:56.430142 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl92t" event={"ID":"94601fe7-f652-4b4c-9dde-c09e889c0583","Type":"ContainerStarted","Data":"03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192"} Dec 12 01:19:56 crc kubenswrapper[4606]: I1212 01:19:56.474934 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pl92t" podStartSLOduration=2.999432057 podStartE2EDuration="5.474886411s" podCreationTimestamp="2025-12-12 01:19:51 +0000 UTC" 
firstStartedPulling="2025-12-12 01:19:53.388691642 +0000 UTC m=+3383.934044528" lastFinishedPulling="2025-12-12 01:19:55.864146016 +0000 UTC m=+3386.409498882" observedRunningTime="2025-12-12 01:19:56.464379216 +0000 UTC m=+3387.009732082" watchObservedRunningTime="2025-12-12 01:19:56.474886411 +0000 UTC m=+3387.020239277" Dec 12 01:19:58 crc kubenswrapper[4606]: I1212 01:19:58.445623 4606 generic.go:334] "Generic (PLEG): container finished" podID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerID="a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4" exitCode=0 Dec 12 01:19:58 crc kubenswrapper[4606]: I1212 01:19:58.445890 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6r8t" event={"ID":"2c78d463-36af-47ea-8f89-aaf053311ee4","Type":"ContainerDied","Data":"a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4"} Dec 12 01:19:59 crc kubenswrapper[4606]: I1212 01:19:59.459922 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6r8t" event={"ID":"2c78d463-36af-47ea-8f89-aaf053311ee4","Type":"ContainerStarted","Data":"f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff"} Dec 12 01:20:00 crc kubenswrapper[4606]: I1212 01:20:00.497921 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b6r8t" podStartSLOduration=3.6687613260000003 podStartE2EDuration="7.497873016s" podCreationTimestamp="2025-12-12 01:19:53 +0000 UTC" firstStartedPulling="2025-12-12 01:19:55.410465259 +0000 UTC m=+3385.955818125" lastFinishedPulling="2025-12-12 01:19:59.239576949 +0000 UTC m=+3389.784929815" observedRunningTime="2025-12-12 01:20:00.48774899 +0000 UTC m=+3391.033101856" watchObservedRunningTime="2025-12-12 01:20:00.497873016 +0000 UTC m=+3391.043225882" Dec 12 01:20:01 crc kubenswrapper[4606]: I1212 01:20:01.981799 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:20:01 crc kubenswrapper[4606]: I1212 01:20:01.982069 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:20:02 crc kubenswrapper[4606]: I1212 01:20:02.030642 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:20:02 crc kubenswrapper[4606]: I1212 01:20:02.549848 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:20:03 crc kubenswrapper[4606]: I1212 01:20:03.764570 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:20:03 crc kubenswrapper[4606]: I1212 01:20:03.765560 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:20:04 crc kubenswrapper[4606]: I1212 01:20:04.816782 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-b6r8t" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="registry-server" probeResult="failure" output=< Dec 12 01:20:04 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 01:20:04 crc kubenswrapper[4606]: > Dec 12 01:20:06 crc kubenswrapper[4606]: I1212 01:20:06.613580 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl92t"] Dec 12 01:20:06 crc kubenswrapper[4606]: I1212 01:20:06.614092 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pl92t" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="registry-server" containerID="cri-o://03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192" gracePeriod=2 Dec 12 01:20:07 crc 
kubenswrapper[4606]: I1212 01:20:07.461906 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.546713 4606 generic.go:334] "Generic (PLEG): container finished" podID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerID="03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192" exitCode=0 Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.546761 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl92t" event={"ID":"94601fe7-f652-4b4c-9dde-c09e889c0583","Type":"ContainerDied","Data":"03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192"} Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.546787 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl92t" event={"ID":"94601fe7-f652-4b4c-9dde-c09e889c0583","Type":"ContainerDied","Data":"1d94b8c631be8a6dacc397999a05cc38e2e88293a125e3baf88dd6ec18e6e61c"} Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.546815 4606 scope.go:117] "RemoveContainer" containerID="03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.546985 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl92t" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.549154 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-catalog-content\") pod \"94601fe7-f652-4b4c-9dde-c09e889c0583\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.549378 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-utilities\") pod \"94601fe7-f652-4b4c-9dde-c09e889c0583\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.549593 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrbh9\" (UniqueName: \"kubernetes.io/projected/94601fe7-f652-4b4c-9dde-c09e889c0583-kube-api-access-lrbh9\") pod \"94601fe7-f652-4b4c-9dde-c09e889c0583\" (UID: \"94601fe7-f652-4b4c-9dde-c09e889c0583\") " Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.571446 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94601fe7-f652-4b4c-9dde-c09e889c0583-kube-api-access-lrbh9" (OuterVolumeSpecName: "kube-api-access-lrbh9") pod "94601fe7-f652-4b4c-9dde-c09e889c0583" (UID: "94601fe7-f652-4b4c-9dde-c09e889c0583"). InnerVolumeSpecName "kube-api-access-lrbh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.572773 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-utilities" (OuterVolumeSpecName: "utilities") pod "94601fe7-f652-4b4c-9dde-c09e889c0583" (UID: "94601fe7-f652-4b4c-9dde-c09e889c0583"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.605667 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94601fe7-f652-4b4c-9dde-c09e889c0583" (UID: "94601fe7-f652-4b4c-9dde-c09e889c0583"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.645490 4606 scope.go:117] "RemoveContainer" containerID="68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.651434 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.651460 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrbh9\" (UniqueName: \"kubernetes.io/projected/94601fe7-f652-4b4c-9dde-c09e889c0583-kube-api-access-lrbh9\") on node \"crc\" DevicePath \"\"" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.651469 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94601fe7-f652-4b4c-9dde-c09e889c0583-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.672758 4606 scope.go:117] "RemoveContainer" containerID="079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.714691 4606 scope.go:117] "RemoveContainer" containerID="03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192" Dec 12 01:20:07 crc kubenswrapper[4606]: E1212 01:20:07.715124 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192\": container with ID starting with 03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192 not found: ID does not exist" containerID="03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.715152 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192"} err="failed to get container status \"03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192\": rpc error: code = NotFound desc = could not find container \"03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192\": container with ID starting with 03b821d41d6630bc651874d1def546b28fa8ebbc07ec91d6935f5419eacc1192 not found: ID does not exist" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.715193 4606 scope.go:117] "RemoveContainer" containerID="68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323" Dec 12 01:20:07 crc kubenswrapper[4606]: E1212 01:20:07.715407 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323\": container with ID starting with 68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323 not found: ID does not exist" containerID="68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.715443 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323"} err="failed to get container status \"68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323\": rpc error: code = NotFound desc = could not find container 
\"68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323\": container with ID starting with 68375f1756df1f0b3980958eda5497ccb2a0caf47c4fbfaa2e3ded9f587b6323 not found: ID does not exist" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.715455 4606 scope.go:117] "RemoveContainer" containerID="079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162" Dec 12 01:20:07 crc kubenswrapper[4606]: E1212 01:20:07.715633 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162\": container with ID starting with 079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162 not found: ID does not exist" containerID="079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.715652 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162"} err="failed to get container status \"079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162\": rpc error: code = NotFound desc = could not find container \"079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162\": container with ID starting with 079f20fca0f6ac479f017ef58bd73839b5e6d36a449e229ae94beb9f2ac27162 not found: ID does not exist" Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.871616 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl92t"] Dec 12 01:20:07 crc kubenswrapper[4606]: I1212 01:20:07.880830 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl92t"] Dec 12 01:20:09 crc kubenswrapper[4606]: I1212 01:20:09.708990 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" 
path="/var/lib/kubelet/pods/94601fe7-f652-4b4c-9dde-c09e889c0583/volumes" Dec 12 01:20:13 crc kubenswrapper[4606]: I1212 01:20:13.817851 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:20:13 crc kubenswrapper[4606]: I1212 01:20:13.876289 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:20:14 crc kubenswrapper[4606]: I1212 01:20:14.059161 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6r8t"] Dec 12 01:20:15 crc kubenswrapper[4606]: I1212 01:20:15.629993 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b6r8t" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="registry-server" containerID="cri-o://f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff" gracePeriod=2 Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.166060 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.306429 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-utilities\") pod \"2c78d463-36af-47ea-8f89-aaf053311ee4\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.306795 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjlng\" (UniqueName: \"kubernetes.io/projected/2c78d463-36af-47ea-8f89-aaf053311ee4-kube-api-access-xjlng\") pod \"2c78d463-36af-47ea-8f89-aaf053311ee4\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.306909 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-catalog-content\") pod \"2c78d463-36af-47ea-8f89-aaf053311ee4\" (UID: \"2c78d463-36af-47ea-8f89-aaf053311ee4\") " Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.307542 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-utilities" (OuterVolumeSpecName: "utilities") pod "2c78d463-36af-47ea-8f89-aaf053311ee4" (UID: "2c78d463-36af-47ea-8f89-aaf053311ee4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.313221 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c78d463-36af-47ea-8f89-aaf053311ee4-kube-api-access-xjlng" (OuterVolumeSpecName: "kube-api-access-xjlng") pod "2c78d463-36af-47ea-8f89-aaf053311ee4" (UID: "2c78d463-36af-47ea-8f89-aaf053311ee4"). InnerVolumeSpecName "kube-api-access-xjlng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.372225 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c78d463-36af-47ea-8f89-aaf053311ee4" (UID: "2c78d463-36af-47ea-8f89-aaf053311ee4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.409406 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjlng\" (UniqueName: \"kubernetes.io/projected/2c78d463-36af-47ea-8f89-aaf053311ee4-kube-api-access-xjlng\") on node \"crc\" DevicePath \"\"" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.409433 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.409442 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c78d463-36af-47ea-8f89-aaf053311ee4-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.642385 4606 generic.go:334] "Generic (PLEG): container finished" podID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerID="f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff" exitCode=0 Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.642434 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6r8t" event={"ID":"2c78d463-36af-47ea-8f89-aaf053311ee4","Type":"ContainerDied","Data":"f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff"} Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.642458 4606 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6r8t" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.642473 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6r8t" event={"ID":"2c78d463-36af-47ea-8f89-aaf053311ee4","Type":"ContainerDied","Data":"4ecd3b24dfb2421fc2563739e393b361eed8c084c8653af74dfaf123dfb6fde6"} Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.642497 4606 scope.go:117] "RemoveContainer" containerID="f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.680116 4606 scope.go:117] "RemoveContainer" containerID="a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.686936 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6r8t"] Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.695473 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b6r8t"] Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.708615 4606 scope.go:117] "RemoveContainer" containerID="8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.747255 4606 scope.go:117] "RemoveContainer" containerID="f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff" Dec 12 01:20:16 crc kubenswrapper[4606]: E1212 01:20:16.747698 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff\": container with ID starting with f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff not found: ID does not exist" containerID="f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.747748 
4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff"} err="failed to get container status \"f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff\": rpc error: code = NotFound desc = could not find container \"f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff\": container with ID starting with f629e73448cb1178eb6e9b0e3533af2ccb744eaef6dceee00cc902ff81a9d2ff not found: ID does not exist" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.747783 4606 scope.go:117] "RemoveContainer" containerID="a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4" Dec 12 01:20:16 crc kubenswrapper[4606]: E1212 01:20:16.748138 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4\": container with ID starting with a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4 not found: ID does not exist" containerID="a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.748160 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4"} err="failed to get container status \"a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4\": rpc error: code = NotFound desc = could not find container \"a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4\": container with ID starting with a71f7ebdbcdb6122e82fa9127e279889190a10c5a36e25d4669662cd5fd074e4 not found: ID does not exist" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.748191 4606 scope.go:117] "RemoveContainer" containerID="8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff" Dec 12 01:20:16 crc kubenswrapper[4606]: E1212 
01:20:16.748444 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff\": container with ID starting with 8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff not found: ID does not exist" containerID="8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff" Dec 12 01:20:16 crc kubenswrapper[4606]: I1212 01:20:16.748484 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff"} err="failed to get container status \"8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff\": rpc error: code = NotFound desc = could not find container \"8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff\": container with ID starting with 8da1e29edbffccf6e87df6c6f816dc02b576c95899777b94de835600df2990ff not found: ID does not exist" Dec 12 01:20:17 crc kubenswrapper[4606]: I1212 01:20:17.710104 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" path="/var/lib/kubelet/pods/2c78d463-36af-47ea-8f89-aaf053311ee4/volumes" Dec 12 01:21:32 crc kubenswrapper[4606]: I1212 01:21:32.010558 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:21:32 crc kubenswrapper[4606]: I1212 01:21:32.011082 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 12 01:22:02 crc kubenswrapper[4606]: I1212 01:22:02.010678 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:22:02 crc kubenswrapper[4606]: I1212 01:22:02.011101 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.010233 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.010901 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.010974 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.012060 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada"} 
pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.012164 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" gracePeriod=600 Dec 12 01:22:32 crc kubenswrapper[4606]: E1212 01:22:32.138140 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.895567 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" exitCode=0 Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.896096 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada"} Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.896250 4606 scope.go:117] "RemoveContainer" containerID="f229baf38a7f21fe7b01bef3e91a8c649ea56881ce0bff1f8c08ca952dad853d" Dec 12 01:22:32 crc kubenswrapper[4606]: I1212 01:22:32.897076 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 
12 01:22:32 crc kubenswrapper[4606]: E1212 01:22:32.897528 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:22:45 crc kubenswrapper[4606]: I1212 01:22:45.699862 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:22:45 crc kubenswrapper[4606]: E1212 01:22:45.700917 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:22:58 crc kubenswrapper[4606]: I1212 01:22:58.700353 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:22:58 crc kubenswrapper[4606]: E1212 01:22:58.701437 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:23:09 crc kubenswrapper[4606]: I1212 01:23:09.707313 4606 scope.go:117] "RemoveContainer" 
containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:23:09 crc kubenswrapper[4606]: E1212 01:23:09.708030 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:23:21 crc kubenswrapper[4606]: I1212 01:23:21.699493 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:23:21 crc kubenswrapper[4606]: E1212 01:23:21.701655 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:23:34 crc kubenswrapper[4606]: I1212 01:23:34.701452 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:23:34 crc kubenswrapper[4606]: E1212 01:23:34.703460 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:23:46 crc kubenswrapper[4606]: I1212 01:23:46.700152 4606 scope.go:117] 
"RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:23:46 crc kubenswrapper[4606]: E1212 01:23:46.701107 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:23:57 crc kubenswrapper[4606]: I1212 01:23:57.700150 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:23:57 crc kubenswrapper[4606]: E1212 01:23:57.701076 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:24:10 crc kubenswrapper[4606]: I1212 01:24:10.700048 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:24:10 crc kubenswrapper[4606]: E1212 01:24:10.700808 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:24:24 crc kubenswrapper[4606]: I1212 01:24:24.699809 
4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:24:24 crc kubenswrapper[4606]: E1212 01:24:24.701073 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:24:36 crc kubenswrapper[4606]: I1212 01:24:36.700082 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:24:36 crc kubenswrapper[4606]: E1212 01:24:36.700864 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:24:48 crc kubenswrapper[4606]: I1212 01:24:48.700027 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:24:48 crc kubenswrapper[4606]: E1212 01:24:48.700786 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:25:01 crc kubenswrapper[4606]: I1212 
01:25:01.700311 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:25:01 crc kubenswrapper[4606]: E1212 01:25:01.701026 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:25:15 crc kubenswrapper[4606]: I1212 01:25:15.704223 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:25:15 crc kubenswrapper[4606]: E1212 01:25:15.704960 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:25:26 crc kubenswrapper[4606]: I1212 01:25:26.700209 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:25:26 crc kubenswrapper[4606]: E1212 01:25:26.701010 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:25:39 crc 
kubenswrapper[4606]: I1212 01:25:39.711218 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:25:39 crc kubenswrapper[4606]: E1212 01:25:39.711846 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:25:51 crc kubenswrapper[4606]: I1212 01:25:51.699531 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:25:51 crc kubenswrapper[4606]: E1212 01:25:51.700101 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.877810 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8whwl"] Dec 12 01:26:02 crc kubenswrapper[4606]: E1212 01:26:02.878915 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="extract-utilities" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.878932 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="extract-utilities" Dec 12 01:26:02 crc kubenswrapper[4606]: E1212 01:26:02.878952 4606 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="extract-content" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.878961 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="extract-content" Dec 12 01:26:02 crc kubenswrapper[4606]: E1212 01:26:02.878975 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="extract-utilities" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.878982 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="extract-utilities" Dec 12 01:26:02 crc kubenswrapper[4606]: E1212 01:26:02.879001 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="registry-server" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.879009 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="registry-server" Dec 12 01:26:02 crc kubenswrapper[4606]: E1212 01:26:02.879061 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="extract-content" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.879070 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="extract-content" Dec 12 01:26:02 crc kubenswrapper[4606]: E1212 01:26:02.879084 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="registry-server" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.879094 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="registry-server" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.879341 4606 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2c78d463-36af-47ea-8f89-aaf053311ee4" containerName="registry-server" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.879372 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="94601fe7-f652-4b4c-9dde-c09e889c0583" containerName="registry-server" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.881016 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:02 crc kubenswrapper[4606]: I1212 01:26:02.893769 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8whwl"] Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.009527 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-utilities\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.009573 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvht5\" (UniqueName: \"kubernetes.io/projected/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-kube-api-access-nvht5\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.009672 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-catalog-content\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.111140 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-utilities\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.111223 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvht5\" (UniqueName: \"kubernetes.io/projected/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-kube-api-access-nvht5\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.111365 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-catalog-content\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.111908 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-utilities\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.111989 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-catalog-content\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.136277 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvht5\" 
(UniqueName: \"kubernetes.io/projected/da3f6c49-c4b6-4fee-a3f5-1635d73e62f2-kube-api-access-nvht5\") pod \"redhat-operators-8whwl\" (UID: \"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2\") " pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.205447 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.701722 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:26:03 crc kubenswrapper[4606]: E1212 01:26:03.702408 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:26:03 crc kubenswrapper[4606]: I1212 01:26:03.722104 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8whwl"] Dec 12 01:26:04 crc kubenswrapper[4606]: I1212 01:26:04.121480 4606 generic.go:334] "Generic (PLEG): container finished" podID="da3f6c49-c4b6-4fee-a3f5-1635d73e62f2" containerID="15894097515bd1f767eae48c41c7b83163bd6e05364aa597cdb237fbd504fe4a" exitCode=0 Dec 12 01:26:04 crc kubenswrapper[4606]: I1212 01:26:04.122447 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whwl" event={"ID":"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2","Type":"ContainerDied","Data":"15894097515bd1f767eae48c41c7b83163bd6e05364aa597cdb237fbd504fe4a"} Dec 12 01:26:04 crc kubenswrapper[4606]: I1212 01:26:04.122731 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8whwl" event={"ID":"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2","Type":"ContainerStarted","Data":"c2796708ad3dbe8db1e11eaaae7de8b9d4758515b54717f0402e99008db1109a"} Dec 12 01:26:04 crc kubenswrapper[4606]: I1212 01:26:04.124317 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 01:26:16 crc kubenswrapper[4606]: I1212 01:26:16.251280 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whwl" event={"ID":"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2","Type":"ContainerStarted","Data":"7972f5db58b9a3a35d9ac86998259db734b53c593fbe682bf31d76d039753fc8"} Dec 12 01:26:18 crc kubenswrapper[4606]: I1212 01:26:18.272271 4606 generic.go:334] "Generic (PLEG): container finished" podID="da3f6c49-c4b6-4fee-a3f5-1635d73e62f2" containerID="7972f5db58b9a3a35d9ac86998259db734b53c593fbe682bf31d76d039753fc8" exitCode=0 Dec 12 01:26:18 crc kubenswrapper[4606]: I1212 01:26:18.272361 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whwl" event={"ID":"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2","Type":"ContainerDied","Data":"7972f5db58b9a3a35d9ac86998259db734b53c593fbe682bf31d76d039753fc8"} Dec 12 01:26:18 crc kubenswrapper[4606]: I1212 01:26:18.699669 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:26:18 crc kubenswrapper[4606]: E1212 01:26:18.700064 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:26:19 crc kubenswrapper[4606]: I1212 01:26:19.282759 
4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8whwl" event={"ID":"da3f6c49-c4b6-4fee-a3f5-1635d73e62f2","Type":"ContainerStarted","Data":"f6ac740a37e3d8f8db4151096839dfb48be93157db9f3b3dd59463fec609dd02"} Dec 12 01:26:19 crc kubenswrapper[4606]: I1212 01:26:19.305807 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8whwl" podStartSLOduration=2.7038070899999997 podStartE2EDuration="17.305766091s" podCreationTimestamp="2025-12-12 01:26:02 +0000 UTC" firstStartedPulling="2025-12-12 01:26:04.124075623 +0000 UTC m=+3754.669428489" lastFinishedPulling="2025-12-12 01:26:18.726034624 +0000 UTC m=+3769.271387490" observedRunningTime="2025-12-12 01:26:19.300791109 +0000 UTC m=+3769.846143995" watchObservedRunningTime="2025-12-12 01:26:19.305766091 +0000 UTC m=+3769.851118957" Dec 12 01:26:23 crc kubenswrapper[4606]: I1212 01:26:23.205735 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:23 crc kubenswrapper[4606]: I1212 01:26:23.206159 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:24 crc kubenswrapper[4606]: I1212 01:26:24.396331 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8whwl" podUID="da3f6c49-c4b6-4fee-a3f5-1635d73e62f2" containerName="registry-server" probeResult="failure" output=< Dec 12 01:26:24 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 01:26:24 crc kubenswrapper[4606]: > Dec 12 01:26:33 crc kubenswrapper[4606]: I1212 01:26:33.266120 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:33 crc kubenswrapper[4606]: I1212 01:26:33.314763 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-8whwl" Dec 12 01:26:33 crc kubenswrapper[4606]: I1212 01:26:33.700613 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:26:33 crc kubenswrapper[4606]: E1212 01:26:33.700896 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:26:33 crc kubenswrapper[4606]: I1212 01:26:33.891017 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8whwl"] Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.072805 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7nww"] Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.073457 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l7nww" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="registry-server" containerID="cri-o://539312d1a8084692cea75168d401825c852b0de6ef1f83e3b9f8e40100f154b6" gracePeriod=2 Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.427921 4606 generic.go:334] "Generic (PLEG): container finished" podID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerID="539312d1a8084692cea75168d401825c852b0de6ef1f83e3b9f8e40100f154b6" exitCode=0 Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.428017 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7nww" 
event={"ID":"9df9dd2c-220d-43e3-a680-21d6ea0622f5","Type":"ContainerDied","Data":"539312d1a8084692cea75168d401825c852b0de6ef1f83e3b9f8e40100f154b6"} Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.620771 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.741325 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dlcq\" (UniqueName: \"kubernetes.io/projected/9df9dd2c-220d-43e3-a680-21d6ea0622f5-kube-api-access-5dlcq\") pod \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.741369 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-catalog-content\") pod \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.741392 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-utilities\") pod \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\" (UID: \"9df9dd2c-220d-43e3-a680-21d6ea0622f5\") " Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.741968 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-utilities" (OuterVolumeSpecName: "utilities") pod "9df9dd2c-220d-43e3-a680-21d6ea0622f5" (UID: "9df9dd2c-220d-43e3-a680-21d6ea0622f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.743591 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.749529 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df9dd2c-220d-43e3-a680-21d6ea0622f5-kube-api-access-5dlcq" (OuterVolumeSpecName: "kube-api-access-5dlcq") pod "9df9dd2c-220d-43e3-a680-21d6ea0622f5" (UID: "9df9dd2c-220d-43e3-a680-21d6ea0622f5"). InnerVolumeSpecName "kube-api-access-5dlcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.845451 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dlcq\" (UniqueName: \"kubernetes.io/projected/9df9dd2c-220d-43e3-a680-21d6ea0622f5-kube-api-access-5dlcq\") on node \"crc\" DevicePath \"\"" Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.846593 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9df9dd2c-220d-43e3-a680-21d6ea0622f5" (UID: "9df9dd2c-220d-43e3-a680-21d6ea0622f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:26:34 crc kubenswrapper[4606]: I1212 01:26:34.947394 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9df9dd2c-220d-43e3-a680-21d6ea0622f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.438159 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7nww" event={"ID":"9df9dd2c-220d-43e3-a680-21d6ea0622f5","Type":"ContainerDied","Data":"d123b28c61aa20b82ec30938185d60eca045d9aaf68efa36bab971be9e6fe687"} Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.438247 4606 scope.go:117] "RemoveContainer" containerID="539312d1a8084692cea75168d401825c852b0de6ef1f83e3b9f8e40100f154b6" Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.438261 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7nww" Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.477391 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7nww"] Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.479251 4606 scope.go:117] "RemoveContainer" containerID="8ad4a502008e1a5347672b41510651b81a542219b2c140d9042033868a21b0b6" Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.487080 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l7nww"] Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.535699 4606 scope.go:117] "RemoveContainer" containerID="2bbd03411a81b51047ee545e4c28b2dc3078fe5f50a1f2f0de6e8a24d1141953" Dec 12 01:26:35 crc kubenswrapper[4606]: I1212 01:26:35.761823 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" path="/var/lib/kubelet/pods/9df9dd2c-220d-43e3-a680-21d6ea0622f5/volumes" Dec 12 01:26:46 crc 
kubenswrapper[4606]: I1212 01:26:46.699877 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:26:46 crc kubenswrapper[4606]: E1212 01:26:46.700534 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:26:58 crc kubenswrapper[4606]: I1212 01:26:58.699943 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:26:58 crc kubenswrapper[4606]: E1212 01:26:58.700788 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:27:10 crc kubenswrapper[4606]: I1212 01:27:10.699520 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:27:10 crc kubenswrapper[4606]: E1212 01:27:10.700118 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 
12 01:27:24 crc kubenswrapper[4606]: I1212 01:27:24.699913 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:27:24 crc kubenswrapper[4606]: E1212 01:27:24.700903 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:27:35 crc kubenswrapper[4606]: I1212 01:27:35.701369 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:27:36 crc kubenswrapper[4606]: I1212 01:27:36.996393 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"b29850fc0cc29c6aaef17ef54ee835cd2490230b421235d791c42309fba8ead5"} Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.806887 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jc7v4"] Dec 12 01:28:00 crc kubenswrapper[4606]: E1212 01:28:00.807895 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="registry-server" Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.807912 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="registry-server" Dec 12 01:28:00 crc kubenswrapper[4606]: E1212 01:28:00.807923 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="extract-content" Dec 12 01:28:00 crc kubenswrapper[4606]: 
I1212 01:28:00.807930 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="extract-content" Dec 12 01:28:00 crc kubenswrapper[4606]: E1212 01:28:00.807951 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="extract-utilities" Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.807959 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="extract-utilities" Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.808256 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df9dd2c-220d-43e3-a680-21d6ea0622f5" containerName="registry-server" Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.826115 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jc7v4"] Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.826502 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.976301 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lst9k\" (UniqueName: \"kubernetes.io/projected/9cdf692d-b841-4469-92db-26d6af933437-kube-api-access-lst9k\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.976648 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-utilities\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:00 crc kubenswrapper[4606]: I1212 01:28:00.976753 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-catalog-content\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.078648 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-utilities\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.079380 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-catalog-content\") pod 
\"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.079799 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lst9k\" (UniqueName: \"kubernetes.io/projected/9cdf692d-b841-4469-92db-26d6af933437-kube-api-access-lst9k\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.079677 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-catalog-content\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.079321 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-utilities\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.111023 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lst9k\" (UniqueName: \"kubernetes.io/projected/9cdf692d-b841-4469-92db-26d6af933437-kube-api-access-lst9k\") pod \"community-operators-jc7v4\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.148101 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:01 crc kubenswrapper[4606]: I1212 01:28:01.732878 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jc7v4"] Dec 12 01:28:02 crc kubenswrapper[4606]: I1212 01:28:02.271399 4606 generic.go:334] "Generic (PLEG): container finished" podID="9cdf692d-b841-4469-92db-26d6af933437" containerID="a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0" exitCode=0 Dec 12 01:28:02 crc kubenswrapper[4606]: I1212 01:28:02.271475 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jc7v4" event={"ID":"9cdf692d-b841-4469-92db-26d6af933437","Type":"ContainerDied","Data":"a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0"} Dec 12 01:28:02 crc kubenswrapper[4606]: I1212 01:28:02.271692 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jc7v4" event={"ID":"9cdf692d-b841-4469-92db-26d6af933437","Type":"ContainerStarted","Data":"bf0afc9617b034f08a18cd34aa9d09ae82605cd47d09203eed29e7b264c4a4f1"} Dec 12 01:28:03 crc kubenswrapper[4606]: I1212 01:28:03.291647 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jc7v4" event={"ID":"9cdf692d-b841-4469-92db-26d6af933437","Type":"ContainerStarted","Data":"e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321"} Dec 12 01:28:04 crc kubenswrapper[4606]: I1212 01:28:04.302819 4606 generic.go:334] "Generic (PLEG): container finished" podID="9cdf692d-b841-4469-92db-26d6af933437" containerID="e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321" exitCode=0 Dec 12 01:28:04 crc kubenswrapper[4606]: I1212 01:28:04.302947 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jc7v4" 
event={"ID":"9cdf692d-b841-4469-92db-26d6af933437","Type":"ContainerDied","Data":"e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321"} Dec 12 01:28:05 crc kubenswrapper[4606]: I1212 01:28:05.316469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jc7v4" event={"ID":"9cdf692d-b841-4469-92db-26d6af933437","Type":"ContainerStarted","Data":"cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1"} Dec 12 01:28:05 crc kubenswrapper[4606]: I1212 01:28:05.339809 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jc7v4" podStartSLOduration=2.617277037 podStartE2EDuration="5.339785274s" podCreationTimestamp="2025-12-12 01:28:00 +0000 UTC" firstStartedPulling="2025-12-12 01:28:02.273470545 +0000 UTC m=+3872.818823411" lastFinishedPulling="2025-12-12 01:28:04.995978732 +0000 UTC m=+3875.541331648" observedRunningTime="2025-12-12 01:28:05.338230643 +0000 UTC m=+3875.883583529" watchObservedRunningTime="2025-12-12 01:28:05.339785274 +0000 UTC m=+3875.885138140" Dec 12 01:28:11 crc kubenswrapper[4606]: I1212 01:28:11.148509 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:11 crc kubenswrapper[4606]: I1212 01:28:11.150272 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:11 crc kubenswrapper[4606]: I1212 01:28:11.212441 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:11 crc kubenswrapper[4606]: I1212 01:28:11.420578 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:11 crc kubenswrapper[4606]: I1212 01:28:11.465686 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-jc7v4"] Dec 12 01:28:13 crc kubenswrapper[4606]: I1212 01:28:13.381123 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jc7v4" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="registry-server" containerID="cri-o://cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1" gracePeriod=2 Dec 12 01:28:13 crc kubenswrapper[4606]: I1212 01:28:13.937291 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.046411 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lst9k\" (UniqueName: \"kubernetes.io/projected/9cdf692d-b841-4469-92db-26d6af933437-kube-api-access-lst9k\") pod \"9cdf692d-b841-4469-92db-26d6af933437\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.046553 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-catalog-content\") pod \"9cdf692d-b841-4469-92db-26d6af933437\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.046857 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-utilities\") pod \"9cdf692d-b841-4469-92db-26d6af933437\" (UID: \"9cdf692d-b841-4469-92db-26d6af933437\") " Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.047608 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-utilities" (OuterVolumeSpecName: "utilities") pod "9cdf692d-b841-4469-92db-26d6af933437" (UID: 
"9cdf692d-b841-4469-92db-26d6af933437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.054906 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdf692d-b841-4469-92db-26d6af933437-kube-api-access-lst9k" (OuterVolumeSpecName: "kube-api-access-lst9k") pod "9cdf692d-b841-4469-92db-26d6af933437" (UID: "9cdf692d-b841-4469-92db-26d6af933437"). InnerVolumeSpecName "kube-api-access-lst9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.096510 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cdf692d-b841-4469-92db-26d6af933437" (UID: "9cdf692d-b841-4469-92db-26d6af933437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.149125 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.149159 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lst9k\" (UniqueName: \"kubernetes.io/projected/9cdf692d-b841-4469-92db-26d6af933437-kube-api-access-lst9k\") on node \"crc\" DevicePath \"\"" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.149189 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cdf692d-b841-4469-92db-26d6af933437-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.406867 4606 generic.go:334] "Generic (PLEG): container finished" 
podID="9cdf692d-b841-4469-92db-26d6af933437" containerID="cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1" exitCode=0 Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.406904 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jc7v4" event={"ID":"9cdf692d-b841-4469-92db-26d6af933437","Type":"ContainerDied","Data":"cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1"} Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.406931 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jc7v4" event={"ID":"9cdf692d-b841-4469-92db-26d6af933437","Type":"ContainerDied","Data":"bf0afc9617b034f08a18cd34aa9d09ae82605cd47d09203eed29e7b264c4a4f1"} Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.406932 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jc7v4" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.406947 4606 scope.go:117] "RemoveContainer" containerID="cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.426266 4606 scope.go:117] "RemoveContainer" containerID="e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.456617 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jc7v4"] Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.462514 4606 scope.go:117] "RemoveContainer" containerID="a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.470900 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jc7v4"] Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.506892 4606 scope.go:117] "RemoveContainer" 
containerID="cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1" Dec 12 01:28:14 crc kubenswrapper[4606]: E1212 01:28:14.507370 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1\": container with ID starting with cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1 not found: ID does not exist" containerID="cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.507411 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1"} err="failed to get container status \"cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1\": rpc error: code = NotFound desc = could not find container \"cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1\": container with ID starting with cf0a9138fd31e4d14fa5e05b30ecb00351ef0c2af0c7501d25979bb5ecc465d1 not found: ID does not exist" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.507437 4606 scope.go:117] "RemoveContainer" containerID="e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321" Dec 12 01:28:14 crc kubenswrapper[4606]: E1212 01:28:14.507926 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321\": container with ID starting with e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321 not found: ID does not exist" containerID="e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.507969 4606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321"} err="failed to get container status \"e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321\": rpc error: code = NotFound desc = could not find container \"e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321\": container with ID starting with e7916b2d2beb1d936794f0ba487e940dca7262f895209b4dfced92296cde7321 not found: ID does not exist" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.507996 4606 scope.go:117] "RemoveContainer" containerID="a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0" Dec 12 01:28:14 crc kubenswrapper[4606]: E1212 01:28:14.508365 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0\": container with ID starting with a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0 not found: ID does not exist" containerID="a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0" Dec 12 01:28:14 crc kubenswrapper[4606]: I1212 01:28:14.508385 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0"} err="failed to get container status \"a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0\": rpc error: code = NotFound desc = could not find container \"a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0\": container with ID starting with a124a3a3b2f061ba588f5f7524973e570e2821b5ee30a9d12efd3d1ff376c6b0 not found: ID does not exist" Dec 12 01:28:15 crc kubenswrapper[4606]: I1212 01:28:15.710097 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cdf692d-b841-4469-92db-26d6af933437" path="/var/lib/kubelet/pods/9cdf692d-b841-4469-92db-26d6af933437/volumes" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 
01:30:00.234917 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7"] Dec 12 01:30:00 crc kubenswrapper[4606]: E1212 01:30:00.235930 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="extract-content" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.235946 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="extract-content" Dec 12 01:30:00 crc kubenswrapper[4606]: E1212 01:30:00.235981 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="registry-server" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.235987 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="registry-server" Dec 12 01:30:00 crc kubenswrapper[4606]: E1212 01:30:00.236008 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="extract-utilities" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.236016 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="extract-utilities" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.236237 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cdf692d-b841-4469-92db-26d6af933437" containerName="registry-server" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.236948 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.245213 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.246961 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.251106 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7"] Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.345391 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wpkq\" (UniqueName: \"kubernetes.io/projected/38083161-817d-4a9a-9fe3-5d7140f36819-kube-api-access-5wpkq\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.345879 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38083161-817d-4a9a-9fe3-5d7140f36819-secret-volume\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.346122 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38083161-817d-4a9a-9fe3-5d7140f36819-config-volume\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.447863 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38083161-817d-4a9a-9fe3-5d7140f36819-secret-volume\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.448022 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38083161-817d-4a9a-9fe3-5d7140f36819-config-volume\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.448131 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wpkq\" (UniqueName: \"kubernetes.io/projected/38083161-817d-4a9a-9fe3-5d7140f36819-kube-api-access-5wpkq\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.448756 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38083161-817d-4a9a-9fe3-5d7140f36819-config-volume\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.748496 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/38083161-817d-4a9a-9fe3-5d7140f36819-secret-volume\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.749129 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wpkq\" (UniqueName: \"kubernetes.io/projected/38083161-817d-4a9a-9fe3-5d7140f36819-kube-api-access-5wpkq\") pod \"collect-profiles-29425050-zvlx7\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:00 crc kubenswrapper[4606]: I1212 01:30:00.860289 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:01 crc kubenswrapper[4606]: I1212 01:30:01.920909 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7"] Dec 12 01:30:02 crc kubenswrapper[4606]: I1212 01:30:02.010313 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:30:02 crc kubenswrapper[4606]: I1212 01:30:02.010648 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:30:02 crc kubenswrapper[4606]: I1212 01:30:02.473529 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" event={"ID":"38083161-817d-4a9a-9fe3-5d7140f36819","Type":"ContainerStarted","Data":"016a963a0ef7e983056fcf14ae2462e46706a8cee6d9773702246d5c846f103b"} Dec 12 01:30:02 crc kubenswrapper[4606]: I1212 01:30:02.473596 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" event={"ID":"38083161-817d-4a9a-9fe3-5d7140f36819","Type":"ContainerStarted","Data":"4dd69fe6350cae60788ca1276063daaa497fff988c523c971f04cb88d9f7de2f"} Dec 12 01:30:02 crc kubenswrapper[4606]: I1212 01:30:02.497573 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" podStartSLOduration=2.497539357 podStartE2EDuration="2.497539357s" podCreationTimestamp="2025-12-12 01:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 01:30:02.497079915 +0000 UTC m=+3993.042432791" watchObservedRunningTime="2025-12-12 01:30:02.497539357 +0000 UTC m=+3993.042892223" Dec 12 01:30:03 crc kubenswrapper[4606]: I1212 01:30:03.482562 4606 generic.go:334] "Generic (PLEG): container finished" podID="38083161-817d-4a9a-9fe3-5d7140f36819" containerID="016a963a0ef7e983056fcf14ae2462e46706a8cee6d9773702246d5c846f103b" exitCode=0 Dec 12 01:30:03 crc kubenswrapper[4606]: I1212 01:30:03.482816 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" event={"ID":"38083161-817d-4a9a-9fe3-5d7140f36819","Type":"ContainerDied","Data":"016a963a0ef7e983056fcf14ae2462e46706a8cee6d9773702246d5c846f103b"} Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.133565 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.242717 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38083161-817d-4a9a-9fe3-5d7140f36819-config-volume\") pod \"38083161-817d-4a9a-9fe3-5d7140f36819\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.242813 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wpkq\" (UniqueName: \"kubernetes.io/projected/38083161-817d-4a9a-9fe3-5d7140f36819-kube-api-access-5wpkq\") pod \"38083161-817d-4a9a-9fe3-5d7140f36819\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.242857 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38083161-817d-4a9a-9fe3-5d7140f36819-secret-volume\") pod \"38083161-817d-4a9a-9fe3-5d7140f36819\" (UID: \"38083161-817d-4a9a-9fe3-5d7140f36819\") " Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.243718 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38083161-817d-4a9a-9fe3-5d7140f36819-config-volume" (OuterVolumeSpecName: "config-volume") pod "38083161-817d-4a9a-9fe3-5d7140f36819" (UID: "38083161-817d-4a9a-9fe3-5d7140f36819"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.248124 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38083161-817d-4a9a-9fe3-5d7140f36819-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "38083161-817d-4a9a-9fe3-5d7140f36819" (UID: "38083161-817d-4a9a-9fe3-5d7140f36819"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.258347 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38083161-817d-4a9a-9fe3-5d7140f36819-kube-api-access-5wpkq" (OuterVolumeSpecName: "kube-api-access-5wpkq") pod "38083161-817d-4a9a-9fe3-5d7140f36819" (UID: "38083161-817d-4a9a-9fe3-5d7140f36819"). InnerVolumeSpecName "kube-api-access-5wpkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.345042 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38083161-817d-4a9a-9fe3-5d7140f36819-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.345296 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wpkq\" (UniqueName: \"kubernetes.io/projected/38083161-817d-4a9a-9fe3-5d7140f36819-kube-api-access-5wpkq\") on node \"crc\" DevicePath \"\"" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.345367 4606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38083161-817d-4a9a-9fe3-5d7140f36819-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.521783 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" event={"ID":"38083161-817d-4a9a-9fe3-5d7140f36819","Type":"ContainerDied","Data":"4dd69fe6350cae60788ca1276063daaa497fff988c523c971f04cb88d9f7de2f"} Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.521826 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425050-zvlx7" Dec 12 01:30:05 crc kubenswrapper[4606]: I1212 01:30:05.521815 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd69fe6350cae60788ca1276063daaa497fff988c523c971f04cb88d9f7de2f" Dec 12 01:30:06 crc kubenswrapper[4606]: I1212 01:30:06.238467 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq"] Dec 12 01:30:06 crc kubenswrapper[4606]: I1212 01:30:06.261564 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425005-jc4rq"] Dec 12 01:30:07 crc kubenswrapper[4606]: I1212 01:30:07.712235 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a80843b7-dee1-423c-b28b-3fbcdf367999" path="/var/lib/kubelet/pods/a80843b7-dee1-423c-b28b-3fbcdf367999/volumes" Dec 12 01:30:32 crc kubenswrapper[4606]: I1212 01:30:32.010514 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:30:32 crc kubenswrapper[4606]: I1212 01:30:32.010915 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.156116 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j76fv"] Dec 12 01:30:34 crc kubenswrapper[4606]: E1212 01:30:34.156788 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38083161-817d-4a9a-9fe3-5d7140f36819" containerName="collect-profiles" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.156803 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="38083161-817d-4a9a-9fe3-5d7140f36819" containerName="collect-profiles" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.157003 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="38083161-817d-4a9a-9fe3-5d7140f36819" containerName="collect-profiles" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.158537 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.199702 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j76fv"] Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.255888 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-utilities\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.255997 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxmm\" (UniqueName: \"kubernetes.io/projected/947a9c91-0411-4d85-a9a1-ecce288235a9-kube-api-access-kzxmm\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.256034 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-catalog-content\") pod \"redhat-marketplace-j76fv\" (UID: 
\"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.357705 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-utilities\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.357923 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxmm\" (UniqueName: \"kubernetes.io/projected/947a9c91-0411-4d85-a9a1-ecce288235a9-kube-api-access-kzxmm\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.358005 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-catalog-content\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.358211 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-utilities\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.358661 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-catalog-content\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " 
pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.383042 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxmm\" (UniqueName: \"kubernetes.io/projected/947a9c91-0411-4d85-a9a1-ecce288235a9-kube-api-access-kzxmm\") pod \"redhat-marketplace-j76fv\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.489329 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:34 crc kubenswrapper[4606]: I1212 01:30:34.951242 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j76fv"] Dec 12 01:30:35 crc kubenswrapper[4606]: I1212 01:30:35.827778 4606 generic.go:334] "Generic (PLEG): container finished" podID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerID="1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4" exitCode=0 Dec 12 01:30:35 crc kubenswrapper[4606]: I1212 01:30:35.828596 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j76fv" event={"ID":"947a9c91-0411-4d85-a9a1-ecce288235a9","Type":"ContainerDied","Data":"1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4"} Dec 12 01:30:35 crc kubenswrapper[4606]: I1212 01:30:35.828635 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j76fv" event={"ID":"947a9c91-0411-4d85-a9a1-ecce288235a9","Type":"ContainerStarted","Data":"ca4d784766a815424b75fa8b2d8f0dd98adfb969cec2abd7c4454ecd0a30dceb"} Dec 12 01:30:36 crc kubenswrapper[4606]: I1212 01:30:36.838630 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j76fv" 
event={"ID":"947a9c91-0411-4d85-a9a1-ecce288235a9","Type":"ContainerStarted","Data":"df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61"} Dec 12 01:30:37 crc kubenswrapper[4606]: I1212 01:30:37.854643 4606 generic.go:334] "Generic (PLEG): container finished" podID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerID="df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61" exitCode=0 Dec 12 01:30:37 crc kubenswrapper[4606]: I1212 01:30:37.854727 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j76fv" event={"ID":"947a9c91-0411-4d85-a9a1-ecce288235a9","Type":"ContainerDied","Data":"df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61"} Dec 12 01:30:38 crc kubenswrapper[4606]: I1212 01:30:38.867641 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j76fv" event={"ID":"947a9c91-0411-4d85-a9a1-ecce288235a9","Type":"ContainerStarted","Data":"1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112"} Dec 12 01:30:44 crc kubenswrapper[4606]: I1212 01:30:44.489862 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:44 crc kubenswrapper[4606]: I1212 01:30:44.490511 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:44 crc kubenswrapper[4606]: I1212 01:30:44.552798 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:44 crc kubenswrapper[4606]: I1212 01:30:44.590641 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j76fv" podStartSLOduration=8.064262447 podStartE2EDuration="10.590575712s" podCreationTimestamp="2025-12-12 01:30:34 +0000 UTC" firstStartedPulling="2025-12-12 01:30:35.832046117 +0000 UTC 
m=+4026.377398973" lastFinishedPulling="2025-12-12 01:30:38.358359362 +0000 UTC m=+4028.903712238" observedRunningTime="2025-12-12 01:30:38.891422455 +0000 UTC m=+4029.436775331" watchObservedRunningTime="2025-12-12 01:30:44.590575712 +0000 UTC m=+4035.135928578" Dec 12 01:30:45 crc kubenswrapper[4606]: I1212 01:30:45.002212 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:45 crc kubenswrapper[4606]: I1212 01:30:45.068646 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j76fv"] Dec 12 01:30:46 crc kubenswrapper[4606]: I1212 01:30:46.950437 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j76fv" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="registry-server" containerID="cri-o://1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112" gracePeriod=2 Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.327550 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zrk9d"] Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.330033 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.340685 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrk9d"] Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.451816 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsrk2\" (UniqueName: \"kubernetes.io/projected/b67be642-049f-46c4-9f3c-40e454379d98-kube-api-access-rsrk2\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.452144 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-utilities\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.452601 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-catalog-content\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.555210 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsrk2\" (UniqueName: \"kubernetes.io/projected/b67be642-049f-46c4-9f3c-40e454379d98-kube-api-access-rsrk2\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.555255 4606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-utilities\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.555374 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-catalog-content\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.555900 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-catalog-content\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.555955 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-utilities\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.585344 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsrk2\" (UniqueName: \"kubernetes.io/projected/b67be642-049f-46c4-9f3c-40e454379d98-kube-api-access-rsrk2\") pod \"certified-operators-zrk9d\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.645368 4606 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.658879 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.758786 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzxmm\" (UniqueName: \"kubernetes.io/projected/947a9c91-0411-4d85-a9a1-ecce288235a9-kube-api-access-kzxmm\") pod \"947a9c91-0411-4d85-a9a1-ecce288235a9\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.758838 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-catalog-content\") pod \"947a9c91-0411-4d85-a9a1-ecce288235a9\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.758956 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-utilities\") pod \"947a9c91-0411-4d85-a9a1-ecce288235a9\" (UID: \"947a9c91-0411-4d85-a9a1-ecce288235a9\") " Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.759857 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-utilities" (OuterVolumeSpecName: "utilities") pod "947a9c91-0411-4d85-a9a1-ecce288235a9" (UID: "947a9c91-0411-4d85-a9a1-ecce288235a9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.762895 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947a9c91-0411-4d85-a9a1-ecce288235a9-kube-api-access-kzxmm" (OuterVolumeSpecName: "kube-api-access-kzxmm") pod "947a9c91-0411-4d85-a9a1-ecce288235a9" (UID: "947a9c91-0411-4d85-a9a1-ecce288235a9"). InnerVolumeSpecName "kube-api-access-kzxmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.800381 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "947a9c91-0411-4d85-a9a1-ecce288235a9" (UID: "947a9c91-0411-4d85-a9a1-ecce288235a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.861684 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.862455 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzxmm\" (UniqueName: \"kubernetes.io/projected/947a9c91-0411-4d85-a9a1-ecce288235a9-kube-api-access-kzxmm\") on node \"crc\" DevicePath \"\"" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.862513 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947a9c91-0411-4d85-a9a1-ecce288235a9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.966629 4606 generic.go:334] "Generic (PLEG): container finished" podID="947a9c91-0411-4d85-a9a1-ecce288235a9" 
containerID="1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112" exitCode=0 Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.966676 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j76fv" event={"ID":"947a9c91-0411-4d85-a9a1-ecce288235a9","Type":"ContainerDied","Data":"1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112"} Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.966706 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j76fv" event={"ID":"947a9c91-0411-4d85-a9a1-ecce288235a9","Type":"ContainerDied","Data":"ca4d784766a815424b75fa8b2d8f0dd98adfb969cec2abd7c4454ecd0a30dceb"} Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.966726 4606 scope.go:117] "RemoveContainer" containerID="1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112" Dec 12 01:30:47 crc kubenswrapper[4606]: I1212 01:30:47.966885 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j76fv" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.014160 4606 scope.go:117] "RemoveContainer" containerID="df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.016214 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j76fv"] Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.024637 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j76fv"] Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.141081 4606 scope.go:117] "RemoveContainer" containerID="1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.186741 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrk9d"] Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.211483 4606 scope.go:117] "RemoveContainer" containerID="1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112" Dec 12 01:30:48 crc kubenswrapper[4606]: E1212 01:30:48.213013 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112\": container with ID starting with 1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112 not found: ID does not exist" containerID="1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.213065 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112"} err="failed to get container status \"1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112\": rpc error: code = NotFound desc = could not find container 
\"1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112\": container with ID starting with 1625610b43d558674ee198642d4e539d0d8d71c4e401a4ca05d912bd83688112 not found: ID does not exist" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.213101 4606 scope.go:117] "RemoveContainer" containerID="df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61" Dec 12 01:30:48 crc kubenswrapper[4606]: E1212 01:30:48.213447 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61\": container with ID starting with df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61 not found: ID does not exist" containerID="df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.213482 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61"} err="failed to get container status \"df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61\": rpc error: code = NotFound desc = could not find container \"df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61\": container with ID starting with df79d94fb12e774bfd13768c685a284fa3520764d34035e5722497cc47a86d61 not found: ID does not exist" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.213510 4606 scope.go:117] "RemoveContainer" containerID="1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4" Dec 12 01:30:48 crc kubenswrapper[4606]: E1212 01:30:48.215031 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4\": container with ID starting with 1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4 not found: ID does not exist" 
containerID="1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.215080 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4"} err="failed to get container status \"1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4\": rpc error: code = NotFound desc = could not find container \"1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4\": container with ID starting with 1b99afbe7ef09ea39759f0319725b33aef3edbf1dda1fca6b959ff9d3397a9f4 not found: ID does not exist" Dec 12 01:30:48 crc kubenswrapper[4606]: I1212 01:30:48.999616 4606 generic.go:334] "Generic (PLEG): container finished" podID="b67be642-049f-46c4-9f3c-40e454379d98" containerID="51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e" exitCode=0 Dec 12 01:30:49 crc kubenswrapper[4606]: I1212 01:30:49.000794 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrk9d" event={"ID":"b67be642-049f-46c4-9f3c-40e454379d98","Type":"ContainerDied","Data":"51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e"} Dec 12 01:30:49 crc kubenswrapper[4606]: I1212 01:30:49.000929 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrk9d" event={"ID":"b67be642-049f-46c4-9f3c-40e454379d98","Type":"ContainerStarted","Data":"baebd7a37110c7b6773701686e6fdf195db61e4884397b147ecf49d707b83ab2"} Dec 12 01:30:49 crc kubenswrapper[4606]: I1212 01:30:49.709392 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" path="/var/lib/kubelet/pods/947a9c91-0411-4d85-a9a1-ecce288235a9/volumes" Dec 12 01:30:51 crc kubenswrapper[4606]: I1212 01:30:51.023302 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrk9d" 
event={"ID":"b67be642-049f-46c4-9f3c-40e454379d98","Type":"ContainerStarted","Data":"dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268"} Dec 12 01:30:53 crc kubenswrapper[4606]: I1212 01:30:53.042517 4606 generic.go:334] "Generic (PLEG): container finished" podID="b67be642-049f-46c4-9f3c-40e454379d98" containerID="dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268" exitCode=0 Dec 12 01:30:53 crc kubenswrapper[4606]: I1212 01:30:53.042585 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrk9d" event={"ID":"b67be642-049f-46c4-9f3c-40e454379d98","Type":"ContainerDied","Data":"dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268"} Dec 12 01:30:54 crc kubenswrapper[4606]: I1212 01:30:54.054096 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrk9d" event={"ID":"b67be642-049f-46c4-9f3c-40e454379d98","Type":"ContainerStarted","Data":"75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d"} Dec 12 01:30:54 crc kubenswrapper[4606]: I1212 01:30:54.075788 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zrk9d" podStartSLOduration=2.390481342 podStartE2EDuration="7.075769893s" podCreationTimestamp="2025-12-12 01:30:47 +0000 UTC" firstStartedPulling="2025-12-12 01:30:49.003527729 +0000 UTC m=+4039.548880595" lastFinishedPulling="2025-12-12 01:30:53.68881627 +0000 UTC m=+4044.234169146" observedRunningTime="2025-12-12 01:30:54.070858793 +0000 UTC m=+4044.616211679" watchObservedRunningTime="2025-12-12 01:30:54.075769893 +0000 UTC m=+4044.621122749" Dec 12 01:30:57 crc kubenswrapper[4606]: I1212 01:30:57.659280 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:57 crc kubenswrapper[4606]: I1212 01:30:57.659905 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:30:58 crc kubenswrapper[4606]: I1212 01:30:58.702612 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zrk9d" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="registry-server" probeResult="failure" output=< Dec 12 01:30:58 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 01:30:58 crc kubenswrapper[4606]: > Dec 12 01:31:02 crc kubenswrapper[4606]: I1212 01:31:02.010281 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:31:02 crc kubenswrapper[4606]: I1212 01:31:02.010794 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:31:02 crc kubenswrapper[4606]: I1212 01:31:02.010840 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:31:02 crc kubenswrapper[4606]: I1212 01:31:02.011709 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b29850fc0cc29c6aaef17ef54ee835cd2490230b421235d791c42309fba8ead5"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:31:02 crc kubenswrapper[4606]: I1212 01:31:02.011765 4606 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://b29850fc0cc29c6aaef17ef54ee835cd2490230b421235d791c42309fba8ead5" gracePeriod=600 Dec 12 01:31:03 crc kubenswrapper[4606]: I1212 01:31:03.137049 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="b29850fc0cc29c6aaef17ef54ee835cd2490230b421235d791c42309fba8ead5" exitCode=0 Dec 12 01:31:03 crc kubenswrapper[4606]: I1212 01:31:03.137203 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"b29850fc0cc29c6aaef17ef54ee835cd2490230b421235d791c42309fba8ead5"} Dec 12 01:31:03 crc kubenswrapper[4606]: I1212 01:31:03.137710 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"} Dec 12 01:31:03 crc kubenswrapper[4606]: I1212 01:31:03.137744 4606 scope.go:117] "RemoveContainer" containerID="6cf0fcbed104ebcc1d2da0cc866e500a495d7669fd69152b8565fd158eb51ada" Dec 12 01:31:04 crc kubenswrapper[4606]: I1212 01:31:04.286617 4606 scope.go:117] "RemoveContainer" containerID="e99e5aad3425459457c138f7c529a53e5bfb99297b683e12d0fdd1457fd6c6e2" Dec 12 01:31:07 crc kubenswrapper[4606]: I1212 01:31:07.714707 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:31:07 crc kubenswrapper[4606]: I1212 01:31:07.765556 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:31:07 crc kubenswrapper[4606]: I1212 
01:31:07.949073 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrk9d"] Dec 12 01:31:09 crc kubenswrapper[4606]: I1212 01:31:09.194870 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zrk9d" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="registry-server" containerID="cri-o://75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d" gracePeriod=2 Dec 12 01:31:09 crc kubenswrapper[4606]: I1212 01:31:09.904552 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:31:09 crc kubenswrapper[4606]: I1212 01:31:09.954936 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsrk2\" (UniqueName: \"kubernetes.io/projected/b67be642-049f-46c4-9f3c-40e454379d98-kube-api-access-rsrk2\") pod \"b67be642-049f-46c4-9f3c-40e454379d98\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " Dec 12 01:31:09 crc kubenswrapper[4606]: I1212 01:31:09.955058 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-utilities\") pod \"b67be642-049f-46c4-9f3c-40e454379d98\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " Dec 12 01:31:09 crc kubenswrapper[4606]: I1212 01:31:09.955249 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-catalog-content\") pod \"b67be642-049f-46c4-9f3c-40e454379d98\" (UID: \"b67be642-049f-46c4-9f3c-40e454379d98\") " Dec 12 01:31:09 crc kubenswrapper[4606]: I1212 01:31:09.955873 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-utilities" (OuterVolumeSpecName: 
"utilities") pod "b67be642-049f-46c4-9f3c-40e454379d98" (UID: "b67be642-049f-46c4-9f3c-40e454379d98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:31:09 crc kubenswrapper[4606]: I1212 01:31:09.961528 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b67be642-049f-46c4-9f3c-40e454379d98-kube-api-access-rsrk2" (OuterVolumeSpecName: "kube-api-access-rsrk2") pod "b67be642-049f-46c4-9f3c-40e454379d98" (UID: "b67be642-049f-46c4-9f3c-40e454379d98"). InnerVolumeSpecName "kube-api-access-rsrk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.008626 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b67be642-049f-46c4-9f3c-40e454379d98" (UID: "b67be642-049f-46c4-9f3c-40e454379d98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.058035 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsrk2\" (UniqueName: \"kubernetes.io/projected/b67be642-049f-46c4-9f3c-40e454379d98-kube-api-access-rsrk2\") on node \"crc\" DevicePath \"\"" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.058090 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.058115 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b67be642-049f-46c4-9f3c-40e454379d98-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.206988 4606 generic.go:334] "Generic (PLEG): container finished" podID="b67be642-049f-46c4-9f3c-40e454379d98" containerID="75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d" exitCode=0 Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.207031 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrk9d" event={"ID":"b67be642-049f-46c4-9f3c-40e454379d98","Type":"ContainerDied","Data":"75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d"} Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.207057 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrk9d" event={"ID":"b67be642-049f-46c4-9f3c-40e454379d98","Type":"ContainerDied","Data":"baebd7a37110c7b6773701686e6fdf195db61e4884397b147ecf49d707b83ab2"} Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.207068 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrk9d" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.207075 4606 scope.go:117] "RemoveContainer" containerID="75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.244589 4606 scope.go:117] "RemoveContainer" containerID="dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.246675 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zrk9d"] Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.261387 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zrk9d"] Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.303416 4606 scope.go:117] "RemoveContainer" containerID="51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.363833 4606 scope.go:117] "RemoveContainer" containerID="75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d" Dec 12 01:31:10 crc kubenswrapper[4606]: E1212 01:31:10.366330 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d\": container with ID starting with 75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d not found: ID does not exist" containerID="75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.366513 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d"} err="failed to get container status \"75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d\": rpc error: code = NotFound desc = could not find 
container \"75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d\": container with ID starting with 75171af308dabd7a6910b95e598ff31b0ac27de5d8534d66cadb9e8ce0207e7d not found: ID does not exist" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.366622 4606 scope.go:117] "RemoveContainer" containerID="dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268" Dec 12 01:31:10 crc kubenswrapper[4606]: E1212 01:31:10.368956 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268\": container with ID starting with dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268 not found: ID does not exist" containerID="dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.369053 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268"} err="failed to get container status \"dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268\": rpc error: code = NotFound desc = could not find container \"dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268\": container with ID starting with dce976602399809b2bf4be316e57a1e0d948236554de697ae7937c0e96644268 not found: ID does not exist" Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.369118 4606 scope.go:117] "RemoveContainer" containerID="51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e" Dec 12 01:31:10 crc kubenswrapper[4606]: E1212 01:31:10.372468 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e\": container with ID starting with 51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e not found: ID does 
not exist" containerID="51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e"
Dec 12 01:31:10 crc kubenswrapper[4606]: I1212 01:31:10.372514 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e"} err="failed to get container status \"51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e\": rpc error: code = NotFound desc = could not find container \"51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e\": container with ID starting with 51b31af81ed70f4317ccf7f3a0f51e5b42addc3f2bdf0ee2f54a33ad7cada81e not found: ID does not exist"
Dec 12 01:31:11 crc kubenswrapper[4606]: I1212 01:31:11.724569 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b67be642-049f-46c4-9f3c-40e454379d98" path="/var/lib/kubelet/pods/b67be642-049f-46c4-9f3c-40e454379d98/volumes"
Dec 12 01:33:02 crc kubenswrapper[4606]: I1212 01:33:02.010687 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 12 01:33:02 crc kubenswrapper[4606]: I1212 01:33:02.011309 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 12 01:33:32 crc kubenswrapper[4606]: I1212 01:33:32.010645 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 12 01:33:32 crc kubenswrapper[4606]: I1212 01:33:32.011326 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.017926 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.018929 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.018978 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5"
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.019926 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.020065 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d" gracePeriod=600
Dec 12 01:34:02 crc kubenswrapper[4606]: E1212 01:34:02.153621 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.635109 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d" exitCode=0
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.635152 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"}
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.635209 4606 scope.go:117] "RemoveContainer" containerID="b29850fc0cc29c6aaef17ef54ee835cd2490230b421235d791c42309fba8ead5"
Dec 12 01:34:02 crc kubenswrapper[4606]: I1212 01:34:02.635982 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:34:02 crc kubenswrapper[4606]: E1212 01:34:02.636450 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:34:17 crc kubenswrapper[4606]: I1212 01:34:17.699577 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:34:17 crc kubenswrapper[4606]: E1212 01:34:17.701310 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:34:32 crc kubenswrapper[4606]: I1212 01:34:32.700614 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:34:32 crc kubenswrapper[4606]: E1212 01:34:32.701975 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:34:47 crc kubenswrapper[4606]: I1212 01:34:47.703298 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:34:47 crc kubenswrapper[4606]: E1212 01:34:47.704415 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:34:59 crc kubenswrapper[4606]: I1212 01:34:59.699858 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:34:59 crc kubenswrapper[4606]: E1212 01:34:59.700911 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:35:11 crc kubenswrapper[4606]: I1212 01:35:11.700015 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:35:11 crc kubenswrapper[4606]: E1212 01:35:11.700863 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:35:26 crc kubenswrapper[4606]: I1212 01:35:26.700417 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:35:26 crc kubenswrapper[4606]: E1212 01:35:26.701151 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:35:41 crc kubenswrapper[4606]: I1212 01:35:41.701276 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:35:41 crc kubenswrapper[4606]: E1212 01:35:41.702356 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:35:53 crc kubenswrapper[4606]: I1212 01:35:53.700878 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:35:53 crc kubenswrapper[4606]: E1212 01:35:53.701771 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:36:05 crc kubenswrapper[4606]: I1212 01:36:05.702001 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:36:05 crc kubenswrapper[4606]: E1212 01:36:05.702747 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:36:19 crc kubenswrapper[4606]: I1212 01:36:19.716928 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:36:19 crc kubenswrapper[4606]: E1212 01:36:19.718512 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:36:30 crc kubenswrapper[4606]: I1212 01:36:30.700279 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:36:30 crc kubenswrapper[4606]: E1212 01:36:30.702341 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:36:42 crc kubenswrapper[4606]: I1212 01:36:42.699740 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:36:42 crc kubenswrapper[4606]: E1212 01:36:42.701296 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:36:57 crc kubenswrapper[4606]: I1212 01:36:57.699307 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:36:57 crc kubenswrapper[4606]: E1212 01:36:57.699997 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.274857 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqsx2"]
Dec 12 01:37:00 crc kubenswrapper[4606]: E1212 01:37:00.275558 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="extract-content"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275573 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="extract-content"
Dec 12 01:37:00 crc kubenswrapper[4606]: E1212 01:37:00.275590 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="extract-utilities"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275598 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="extract-utilities"
Dec 12 01:37:00 crc kubenswrapper[4606]: E1212 01:37:00.275614 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="extract-utilities"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275623 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="extract-utilities"
Dec 12 01:37:00 crc kubenswrapper[4606]: E1212 01:37:00.275641 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="extract-content"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275648 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="extract-content"
Dec 12 01:37:00 crc kubenswrapper[4606]: E1212 01:37:00.275660 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="registry-server"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275668 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="registry-server"
Dec 12 01:37:00 crc kubenswrapper[4606]: E1212 01:37:00.275682 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="registry-server"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275691 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="registry-server"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275909 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67be642-049f-46c4-9f3c-40e454379d98" containerName="registry-server"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.275942 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="947a9c91-0411-4d85-a9a1-ecce288235a9" containerName="registry-server"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.277641 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.292028 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqsx2"]
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.358553 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-catalog-content\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.358646 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsbdt\" (UniqueName: \"kubernetes.io/projected/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-kube-api-access-jsbdt\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.358705 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-utilities\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.460390 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsbdt\" (UniqueName: \"kubernetes.io/projected/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-kube-api-access-jsbdt\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.460508 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-utilities\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.460987 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-utilities\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.461124 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-catalog-content\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.461426 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-catalog-content\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.484490 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsbdt\" (UniqueName: \"kubernetes.io/projected/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-kube-api-access-jsbdt\") pod \"redhat-operators-gqsx2\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") " pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:00 crc kubenswrapper[4606]: I1212 01:37:00.614503 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:01 crc kubenswrapper[4606]: I1212 01:37:01.226764 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqsx2"]
Dec 12 01:37:01 crc kubenswrapper[4606]: I1212 01:37:01.349701 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqsx2" event={"ID":"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4","Type":"ContainerStarted","Data":"20227b7f8b0e8f76797bad737a1a8930a553cf68be6de20c05e8ede3904ea031"}
Dec 12 01:37:02 crc kubenswrapper[4606]: I1212 01:37:02.365683 4606 generic.go:334] "Generic (PLEG): container finished" podID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerID="8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1" exitCode=0
Dec 12 01:37:02 crc kubenswrapper[4606]: I1212 01:37:02.366021 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqsx2" event={"ID":"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4","Type":"ContainerDied","Data":"8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1"}
Dec 12 01:37:02 crc kubenswrapper[4606]: I1212 01:37:02.369054 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 12 01:37:05 crc kubenswrapper[4606]: I1212 01:37:05.412167 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqsx2" event={"ID":"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4","Type":"ContainerStarted","Data":"a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568"}
Dec 12 01:37:08 crc kubenswrapper[4606]: I1212 01:37:08.699459 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:37:08 crc kubenswrapper[4606]: E1212 01:37:08.700294 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:37:10 crc kubenswrapper[4606]: I1212 01:37:10.458519 4606 generic.go:334] "Generic (PLEG): container finished" podID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerID="a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568" exitCode=0
Dec 12 01:37:10 crc kubenswrapper[4606]: I1212 01:37:10.458738 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqsx2" event={"ID":"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4","Type":"ContainerDied","Data":"a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568"}
Dec 12 01:37:11 crc kubenswrapper[4606]: I1212 01:37:11.473305 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqsx2" event={"ID":"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4","Type":"ContainerStarted","Data":"c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2"}
Dec 12 01:37:11 crc kubenswrapper[4606]: I1212 01:37:11.500636 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqsx2" podStartSLOduration=2.6830314140000002 podStartE2EDuration="11.500593718s" podCreationTimestamp="2025-12-12 01:37:00 +0000 UTC" firstStartedPulling="2025-12-12 01:37:02.368775218 +0000 UTC m=+4412.914128084" lastFinishedPulling="2025-12-12 01:37:11.186337522 +0000 UTC m=+4421.731690388" observedRunningTime="2025-12-12 01:37:11.490319675 +0000 UTC m=+4422.035672551" watchObservedRunningTime="2025-12-12 01:37:11.500593718 +0000 UTC m=+4422.045946584"
Dec 12 01:37:20 crc kubenswrapper[4606]: I1212 01:37:20.615541 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:20 crc kubenswrapper[4606]: I1212 01:37:20.616074 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:21 crc kubenswrapper[4606]: I1212 01:37:21.679084 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqsx2" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="registry-server" probeResult="failure" output=<
Dec 12 01:37:21 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s
Dec 12 01:37:21 crc kubenswrapper[4606]: >
Dec 12 01:37:23 crc kubenswrapper[4606]: I1212 01:37:23.700119 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:37:23 crc kubenswrapper[4606]: E1212 01:37:23.700366 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:37:30 crc kubenswrapper[4606]: I1212 01:37:30.675995 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:30 crc kubenswrapper[4606]: I1212 01:37:30.729643 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:31 crc kubenswrapper[4606]: I1212 01:37:31.481192 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqsx2"]
Dec 12 01:37:32 crc kubenswrapper[4606]: I1212 01:37:32.652243 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqsx2" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="registry-server" containerID="cri-o://c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2" gracePeriod=2
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.249211 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.355569 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-utilities\") pod \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") "
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.355698 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-catalog-content\") pod \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") "
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.355779 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsbdt\" (UniqueName: \"kubernetes.io/projected/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-kube-api-access-jsbdt\") pod \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\" (UID: \"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4\") "
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.357261 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-utilities" (OuterVolumeSpecName: "utilities") pod "0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" (UID: "0cc07a26-a736-4069-b1ba-fd4d6bfe94d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.360908 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-kube-api-access-jsbdt" (OuterVolumeSpecName: "kube-api-access-jsbdt") pod "0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" (UID: "0cc07a26-a736-4069-b1ba-fd4d6bfe94d4"). InnerVolumeSpecName "kube-api-access-jsbdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.457426 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsbdt\" (UniqueName: \"kubernetes.io/projected/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-kube-api-access-jsbdt\") on node \"crc\" DevicePath \"\""
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.457454 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-utilities\") on node \"crc\" DevicePath \"\""
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.465562 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" (UID: "0cc07a26-a736-4069-b1ba-fd4d6bfe94d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.558750 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.667086 4606 generic.go:334] "Generic (PLEG): container finished" podID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerID="c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2" exitCode=0
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.667158 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqsx2" event={"ID":"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4","Type":"ContainerDied","Data":"c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2"}
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.667234 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqsx2"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.667261 4606 scope.go:117] "RemoveContainer" containerID="c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.667244 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqsx2" event={"ID":"0cc07a26-a736-4069-b1ba-fd4d6bfe94d4","Type":"ContainerDied","Data":"20227b7f8b0e8f76797bad737a1a8930a553cf68be6de20c05e8ede3904ea031"}
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.745860 4606 scope.go:117] "RemoveContainer" containerID="a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.758423 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqsx2"]
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.758475 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqsx2"]
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.786953 4606 scope.go:117] "RemoveContainer" containerID="8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.832880 4606 scope.go:117] "RemoveContainer" containerID="c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2"
Dec 12 01:37:33 crc kubenswrapper[4606]: E1212 01:37:33.833505 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2\": container with ID starting with c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2 not found: ID does not exist" containerID="c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.833533 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2"} err="failed to get container status \"c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2\": rpc error: code = NotFound desc = could not find container \"c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2\": container with ID starting with c0ab4daf560ab4a631394f4367e0851d41a313e04249000477c48d28f292bdd2 not found: ID does not exist"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.833554 4606 scope.go:117] "RemoveContainer" containerID="a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568"
Dec 12 01:37:33 crc kubenswrapper[4606]: E1212 01:37:33.839072 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568\": container with ID starting with a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568 not found: ID does not exist" containerID="a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.839107 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568"} err="failed to get container status \"a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568\": rpc error: code = NotFound desc = could not find container \"a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568\": container with ID starting with a427cd1011a9ac04b9423b57b58e0587d79e090337ab4b6318a5dc8ec6620568 not found: ID does not exist"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.839131 4606 scope.go:117] "RemoveContainer" containerID="8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1"
Dec 12 01:37:33 crc kubenswrapper[4606]: E1212 01:37:33.841473 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1\": container with ID starting with 8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1 not found: ID does not exist" containerID="8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1"
Dec 12 01:37:33 crc kubenswrapper[4606]: I1212 01:37:33.841502 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1"} err="failed to get container status \"8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1\": rpc error: code = NotFound desc = could not find container \"8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1\": container with ID starting with 8e88ebbab681835f363a52fab4c8467a2ebe6fbf1031e29d3957a78b1a88e8e1 not found: ID does not exist"
Dec 12 01:37:34 crc kubenswrapper[4606]: I1212 01:37:34.700248 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:37:34 crc kubenswrapper[4606]: E1212 01:37:34.701059 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:37:35 crc kubenswrapper[4606]: I1212 01:37:35.711201 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" path="/var/lib/kubelet/pods/0cc07a26-a736-4069-b1ba-fd4d6bfe94d4/volumes"
Dec 12 01:37:47 crc kubenswrapper[4606]: I1212 01:37:47.699974 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:37:47 crc kubenswrapper[4606]: E1212 01:37:47.700915 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:37:59 crc kubenswrapper[4606]: I1212 01:37:59.706690 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:37:59 crc kubenswrapper[4606]: E1212 01:37:59.711571 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:38:11 crc kubenswrapper[4606]: I1212 01:38:11.705773 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d"
Dec 12 01:38:11 crc kubenswrapper[4606]: E1212 01:38:11.706385 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:38:26 crc kubenswrapper[4606]: I1212 
01:38:26.700051 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d" Dec 12 01:38:26 crc kubenswrapper[4606]: E1212 01:38:26.700885 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:38:40 crc kubenswrapper[4606]: I1212 01:38:40.701565 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d" Dec 12 01:38:40 crc kubenswrapper[4606]: E1212 01:38:40.702422 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.321305 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-slpjk"] Dec 12 01:38:49 crc kubenswrapper[4606]: E1212 01:38:49.322115 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="extract-utilities" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.322128 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="extract-utilities" Dec 12 01:38:49 crc kubenswrapper[4606]: E1212 01:38:49.322158 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="registry-server" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.322164 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="registry-server" Dec 12 01:38:49 crc kubenswrapper[4606]: E1212 01:38:49.322191 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="extract-content" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.322198 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="extract-content" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.322404 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc07a26-a736-4069-b1ba-fd4d6bfe94d4" containerName="registry-server" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.323864 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.333110 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slpjk"] Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.388663 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-utilities\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.388716 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-catalog-content\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") 
" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.388735 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9frvw\" (UniqueName: \"kubernetes.io/projected/3bf91206-a2cc-486f-b03b-a9029dc6d183-kube-api-access-9frvw\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.490034 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-utilities\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.490101 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-catalog-content\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.490142 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9frvw\" (UniqueName: \"kubernetes.io/projected/3bf91206-a2cc-486f-b03b-a9029dc6d183-kube-api-access-9frvw\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.490651 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-catalog-content\") pod \"community-operators-slpjk\" (UID: 
\"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.490873 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-utilities\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.508999 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9frvw\" (UniqueName: \"kubernetes.io/projected/3bf91206-a2cc-486f-b03b-a9029dc6d183-kube-api-access-9frvw\") pod \"community-operators-slpjk\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:49 crc kubenswrapper[4606]: I1212 01:38:49.643343 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:50 crc kubenswrapper[4606]: I1212 01:38:50.245779 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slpjk"] Dec 12 01:38:50 crc kubenswrapper[4606]: I1212 01:38:50.764109 4606 generic.go:334] "Generic (PLEG): container finished" podID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerID="02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0" exitCode=0 Dec 12 01:38:50 crc kubenswrapper[4606]: I1212 01:38:50.764348 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slpjk" event={"ID":"3bf91206-a2cc-486f-b03b-a9029dc6d183","Type":"ContainerDied","Data":"02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0"} Dec 12 01:38:50 crc kubenswrapper[4606]: I1212 01:38:50.766300 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slpjk" 
event={"ID":"3bf91206-a2cc-486f-b03b-a9029dc6d183","Type":"ContainerStarted","Data":"853f79d557f621fde34fa2363c0d29e8d9dbb651055b55170f507e537cc74872"} Dec 12 01:38:51 crc kubenswrapper[4606]: I1212 01:38:51.699609 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d" Dec 12 01:38:51 crc kubenswrapper[4606]: E1212 01:38:51.705897 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:38:52 crc kubenswrapper[4606]: I1212 01:38:52.791474 4606 generic.go:334] "Generic (PLEG): container finished" podID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerID="5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb" exitCode=0 Dec 12 01:38:52 crc kubenswrapper[4606]: I1212 01:38:52.791817 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slpjk" event={"ID":"3bf91206-a2cc-486f-b03b-a9029dc6d183","Type":"ContainerDied","Data":"5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb"} Dec 12 01:38:53 crc kubenswrapper[4606]: I1212 01:38:53.801789 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slpjk" event={"ID":"3bf91206-a2cc-486f-b03b-a9029dc6d183","Type":"ContainerStarted","Data":"e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37"} Dec 12 01:38:53 crc kubenswrapper[4606]: I1212 01:38:53.837539 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-slpjk" podStartSLOduration=2.324652581 podStartE2EDuration="4.837473268s" 
podCreationTimestamp="2025-12-12 01:38:49 +0000 UTC" firstStartedPulling="2025-12-12 01:38:50.766331245 +0000 UTC m=+4521.311684111" lastFinishedPulling="2025-12-12 01:38:53.279151922 +0000 UTC m=+4523.824504798" observedRunningTime="2025-12-12 01:38:53.837120189 +0000 UTC m=+4524.382473055" watchObservedRunningTime="2025-12-12 01:38:53.837473268 +0000 UTC m=+4524.382826134" Dec 12 01:38:59 crc kubenswrapper[4606]: I1212 01:38:59.644256 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:59 crc kubenswrapper[4606]: I1212 01:38:59.644934 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:59 crc kubenswrapper[4606]: I1212 01:38:59.696673 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:59 crc kubenswrapper[4606]: I1212 01:38:59.916016 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:38:59 crc kubenswrapper[4606]: I1212 01:38:59.970160 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-slpjk"] Dec 12 01:39:01 crc kubenswrapper[4606]: I1212 01:39:01.870350 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-slpjk" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="registry-server" containerID="cri-o://e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37" gracePeriod=2 Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.599026 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.765077 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9frvw\" (UniqueName: \"kubernetes.io/projected/3bf91206-a2cc-486f-b03b-a9029dc6d183-kube-api-access-9frvw\") pod \"3bf91206-a2cc-486f-b03b-a9029dc6d183\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.765189 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-catalog-content\") pod \"3bf91206-a2cc-486f-b03b-a9029dc6d183\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.765690 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-utilities\") pod \"3bf91206-a2cc-486f-b03b-a9029dc6d183\" (UID: \"3bf91206-a2cc-486f-b03b-a9029dc6d183\") " Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.766524 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-utilities" (OuterVolumeSpecName: "utilities") pod "3bf91206-a2cc-486f-b03b-a9029dc6d183" (UID: "3bf91206-a2cc-486f-b03b-a9029dc6d183"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.771013 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf91206-a2cc-486f-b03b-a9029dc6d183-kube-api-access-9frvw" (OuterVolumeSpecName: "kube-api-access-9frvw") pod "3bf91206-a2cc-486f-b03b-a9029dc6d183" (UID: "3bf91206-a2cc-486f-b03b-a9029dc6d183"). InnerVolumeSpecName "kube-api-access-9frvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.842931 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bf91206-a2cc-486f-b03b-a9029dc6d183" (UID: "3bf91206-a2cc-486f-b03b-a9029dc6d183"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.868531 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.868563 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bf91206-a2cc-486f-b03b-a9029dc6d183-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.868575 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9frvw\" (UniqueName: \"kubernetes.io/projected/3bf91206-a2cc-486f-b03b-a9029dc6d183-kube-api-access-9frvw\") on node \"crc\" DevicePath \"\"" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.884285 4606 generic.go:334] "Generic (PLEG): container finished" podID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerID="e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37" exitCode=0 Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.884335 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slpjk" event={"ID":"3bf91206-a2cc-486f-b03b-a9029dc6d183","Type":"ContainerDied","Data":"e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37"} Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.884358 4606 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-slpjk" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.884380 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slpjk" event={"ID":"3bf91206-a2cc-486f-b03b-a9029dc6d183","Type":"ContainerDied","Data":"853f79d557f621fde34fa2363c0d29e8d9dbb651055b55170f507e537cc74872"} Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.884405 4606 scope.go:117] "RemoveContainer" containerID="e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.925058 4606 scope.go:117] "RemoveContainer" containerID="5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.929442 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-slpjk"] Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.939150 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-slpjk"] Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.945237 4606 scope.go:117] "RemoveContainer" containerID="02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.982671 4606 scope.go:117] "RemoveContainer" containerID="e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37" Dec 12 01:39:02 crc kubenswrapper[4606]: E1212 01:39:02.982893 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37\": container with ID starting with e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37 not found: ID does not exist" containerID="e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.982925 
4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37"} err="failed to get container status \"e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37\": rpc error: code = NotFound desc = could not find container \"e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37\": container with ID starting with e5dcb840b691c94621d14086a09d43fed49d7c2b95543e6f76cd02f1f3e91c37 not found: ID does not exist" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.982944 4606 scope.go:117] "RemoveContainer" containerID="5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb" Dec 12 01:39:02 crc kubenswrapper[4606]: E1212 01:39:02.983310 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb\": container with ID starting with 5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb not found: ID does not exist" containerID="5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.983426 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb"} err="failed to get container status \"5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb\": rpc error: code = NotFound desc = could not find container \"5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb\": container with ID starting with 5e5ed88ed6fee4838847fc1b1c0e5c79b46d5cc91f219695e4906a131abf20fb not found: ID does not exist" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.983542 4606 scope.go:117] "RemoveContainer" containerID="02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0" Dec 12 01:39:02 crc kubenswrapper[4606]: E1212 
01:39:02.983982 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0\": container with ID starting with 02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0 not found: ID does not exist" containerID="02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0" Dec 12 01:39:02 crc kubenswrapper[4606]: I1212 01:39:02.984024 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0"} err="failed to get container status \"02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0\": rpc error: code = NotFound desc = could not find container \"02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0\": container with ID starting with 02fe82f1a41e7bd9fc6188ecad7d991eb70114789b410565be8cd043a474b0a0 not found: ID does not exist" Dec 12 01:39:03 crc kubenswrapper[4606]: I1212 01:39:03.700807 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d" Dec 12 01:39:03 crc kubenswrapper[4606]: I1212 01:39:03.736645 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" path="/var/lib/kubelet/pods/3bf91206-a2cc-486f-b03b-a9029dc6d183/volumes" Dec 12 01:39:04 crc kubenswrapper[4606]: I1212 01:39:04.906387 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"cf2702dedf8aea1ffac5420ea4eaa7ba89f37e569f5fac1a39fd527f11622c29"} Dec 12 01:40:37 crc kubenswrapper[4606]: I1212 01:40:37.854791 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-49cd7"] Dec 12 01:40:37 crc kubenswrapper[4606]: E1212 
01:40:37.855705 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="extract-utilities" Dec 12 01:40:37 crc kubenswrapper[4606]: I1212 01:40:37.855717 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="extract-utilities" Dec 12 01:40:37 crc kubenswrapper[4606]: E1212 01:40:37.855739 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="extract-content" Dec 12 01:40:37 crc kubenswrapper[4606]: I1212 01:40:37.855744 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="extract-content" Dec 12 01:40:37 crc kubenswrapper[4606]: E1212 01:40:37.855759 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="registry-server" Dec 12 01:40:37 crc kubenswrapper[4606]: I1212 01:40:37.855767 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="registry-server" Dec 12 01:40:37 crc kubenswrapper[4606]: I1212 01:40:37.855938 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf91206-a2cc-486f-b03b-a9029dc6d183" containerName="registry-server" Dec 12 01:40:37 crc kubenswrapper[4606]: I1212 01:40:37.857377 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:37 crc kubenswrapper[4606]: I1212 01:40:37.884434 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cd7"] Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.033953 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-catalog-content\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.034696 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs2m7\" (UniqueName: \"kubernetes.io/projected/1526248f-6142-46fa-aa48-04097eab4666-kube-api-access-vs2m7\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.034895 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-utilities\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.137314 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-catalog-content\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.137407 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vs2m7\" (UniqueName: \"kubernetes.io/projected/1526248f-6142-46fa-aa48-04097eab4666-kube-api-access-vs2m7\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.137462 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-utilities\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.137791 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-catalog-content\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.137829 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-utilities\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.248531 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs2m7\" (UniqueName: \"kubernetes.io/projected/1526248f-6142-46fa-aa48-04097eab4666-kube-api-access-vs2m7\") pod \"redhat-marketplace-49cd7\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.476290 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:38 crc kubenswrapper[4606]: I1212 01:40:38.988342 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cd7"] Dec 12 01:40:39 crc kubenswrapper[4606]: I1212 01:40:39.786659 4606 generic.go:334] "Generic (PLEG): container finished" podID="1526248f-6142-46fa-aa48-04097eab4666" containerID="3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d" exitCode=0 Dec 12 01:40:39 crc kubenswrapper[4606]: I1212 01:40:39.786709 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cd7" event={"ID":"1526248f-6142-46fa-aa48-04097eab4666","Type":"ContainerDied","Data":"3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d"} Dec 12 01:40:39 crc kubenswrapper[4606]: I1212 01:40:39.786987 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cd7" event={"ID":"1526248f-6142-46fa-aa48-04097eab4666","Type":"ContainerStarted","Data":"dd06241f2ed0120f364d9a639bb9effa6ea157e4bc3bf36b0ec7358883b5b40d"} Dec 12 01:40:41 crc kubenswrapper[4606]: I1212 01:40:41.807146 4606 generic.go:334] "Generic (PLEG): container finished" podID="1526248f-6142-46fa-aa48-04097eab4666" containerID="2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f" exitCode=0 Dec 12 01:40:41 crc kubenswrapper[4606]: I1212 01:40:41.807243 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cd7" event={"ID":"1526248f-6142-46fa-aa48-04097eab4666","Type":"ContainerDied","Data":"2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f"} Dec 12 01:40:42 crc kubenswrapper[4606]: I1212 01:40:42.819792 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cd7" 
event={"ID":"1526248f-6142-46fa-aa48-04097eab4666","Type":"ContainerStarted","Data":"b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe"} Dec 12 01:40:42 crc kubenswrapper[4606]: I1212 01:40:42.844278 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-49cd7" podStartSLOduration=3.351180683 podStartE2EDuration="5.844251625s" podCreationTimestamp="2025-12-12 01:40:37 +0000 UTC" firstStartedPulling="2025-12-12 01:40:39.788976364 +0000 UTC m=+4630.334329230" lastFinishedPulling="2025-12-12 01:40:42.282047296 +0000 UTC m=+4632.827400172" observedRunningTime="2025-12-12 01:40:42.838685887 +0000 UTC m=+4633.384038763" watchObservedRunningTime="2025-12-12 01:40:42.844251625 +0000 UTC m=+4633.389604481" Dec 12 01:40:48 crc kubenswrapper[4606]: I1212 01:40:48.476637 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:48 crc kubenswrapper[4606]: I1212 01:40:48.477346 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:48 crc kubenswrapper[4606]: I1212 01:40:48.598342 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:48 crc kubenswrapper[4606]: I1212 01:40:48.944726 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:48 crc kubenswrapper[4606]: I1212 01:40:48.999264 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cd7"] Dec 12 01:40:50 crc kubenswrapper[4606]: I1212 01:40:50.910274 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-49cd7" podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="registry-server" 
containerID="cri-o://b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe" gracePeriod=2 Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.568407 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.695624 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs2m7\" (UniqueName: \"kubernetes.io/projected/1526248f-6142-46fa-aa48-04097eab4666-kube-api-access-vs2m7\") pod \"1526248f-6142-46fa-aa48-04097eab4666\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.695746 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-catalog-content\") pod \"1526248f-6142-46fa-aa48-04097eab4666\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.695848 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-utilities\") pod \"1526248f-6142-46fa-aa48-04097eab4666\" (UID: \"1526248f-6142-46fa-aa48-04097eab4666\") " Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.697036 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-utilities" (OuterVolumeSpecName: "utilities") pod "1526248f-6142-46fa-aa48-04097eab4666" (UID: "1526248f-6142-46fa-aa48-04097eab4666"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.701695 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1526248f-6142-46fa-aa48-04097eab4666-kube-api-access-vs2m7" (OuterVolumeSpecName: "kube-api-access-vs2m7") pod "1526248f-6142-46fa-aa48-04097eab4666" (UID: "1526248f-6142-46fa-aa48-04097eab4666"). InnerVolumeSpecName "kube-api-access-vs2m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.729826 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1526248f-6142-46fa-aa48-04097eab4666" (UID: "1526248f-6142-46fa-aa48-04097eab4666"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.801687 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.801738 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1526248f-6142-46fa-aa48-04097eab4666-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.801757 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs2m7\" (UniqueName: \"kubernetes.io/projected/1526248f-6142-46fa-aa48-04097eab4666-kube-api-access-vs2m7\") on node \"crc\" DevicePath \"\"" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.926242 4606 generic.go:334] "Generic (PLEG): container finished" podID="1526248f-6142-46fa-aa48-04097eab4666" 
containerID="b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe" exitCode=0 Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.926849 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cd7" event={"ID":"1526248f-6142-46fa-aa48-04097eab4666","Type":"ContainerDied","Data":"b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe"} Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.927476 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cd7" event={"ID":"1526248f-6142-46fa-aa48-04097eab4666","Type":"ContainerDied","Data":"dd06241f2ed0120f364d9a639bb9effa6ea157e4bc3bf36b0ec7358883b5b40d"} Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.926962 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cd7" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.928217 4606 scope.go:117] "RemoveContainer" containerID="b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.970519 4606 scope.go:117] "RemoveContainer" containerID="2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f" Dec 12 01:40:51 crc kubenswrapper[4606]: I1212 01:40:51.978880 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cd7"] Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.000298 4606 scope.go:117] "RemoveContainer" containerID="3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d" Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.001512 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cd7"] Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.045936 4606 scope.go:117] "RemoveContainer" containerID="b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe" Dec 12 
01:40:52 crc kubenswrapper[4606]: E1212 01:40:52.048104 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe\": container with ID starting with b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe not found: ID does not exist" containerID="b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe" Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.048162 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe"} err="failed to get container status \"b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe\": rpc error: code = NotFound desc = could not find container \"b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe\": container with ID starting with b16ca590e0d932005d092ab33b014f2d299551b100d3a319f7b4d559059ae4fe not found: ID does not exist" Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.050257 4606 scope.go:117] "RemoveContainer" containerID="2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f" Dec 12 01:40:52 crc kubenswrapper[4606]: E1212 01:40:52.050755 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f\": container with ID starting with 2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f not found: ID does not exist" containerID="2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f" Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.050796 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f"} err="failed to get container status 
\"2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f\": rpc error: code = NotFound desc = could not find container \"2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f\": container with ID starting with 2a566b9133443978d14e9667c26f4567b9e1b738c94346d62484cca41b62db2f not found: ID does not exist" Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.050823 4606 scope.go:117] "RemoveContainer" containerID="3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d" Dec 12 01:40:52 crc kubenswrapper[4606]: E1212 01:40:52.051851 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d\": container with ID starting with 3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d not found: ID does not exist" containerID="3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d" Dec 12 01:40:52 crc kubenswrapper[4606]: I1212 01:40:52.051892 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d"} err="failed to get container status \"3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d\": rpc error: code = NotFound desc = could not find container \"3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d\": container with ID starting with 3f4593a3c9fdea97c81130ed846d1b4a708e9f11c6c138bd8d1949ed6f40ed2d not found: ID does not exist" Dec 12 01:40:53 crc kubenswrapper[4606]: I1212 01:40:53.712921 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1526248f-6142-46fa-aa48-04097eab4666" path="/var/lib/kubelet/pods/1526248f-6142-46fa-aa48-04097eab4666/volumes" Dec 12 01:41:32 crc kubenswrapper[4606]: I1212 01:41:32.010804 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:41:32 crc kubenswrapper[4606]: I1212 01:41:32.011578 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:42:02 crc kubenswrapper[4606]: I1212 01:42:02.010447 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:42:02 crc kubenswrapper[4606]: I1212 01:42:02.011063 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.010113 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.010760 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.010808 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.011586 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf2702dedf8aea1ffac5420ea4eaa7ba89f37e569f5fac1a39fd527f11622c29"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.011649 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://cf2702dedf8aea1ffac5420ea4eaa7ba89f37e569f5fac1a39fd527f11622c29" gracePeriod=600 Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.955633 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="cf2702dedf8aea1ffac5420ea4eaa7ba89f37e569f5fac1a39fd527f11622c29" exitCode=0 Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.955698 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"cf2702dedf8aea1ffac5420ea4eaa7ba89f37e569f5fac1a39fd527f11622c29"} Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.955957 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" 
event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8"} Dec 12 01:42:32 crc kubenswrapper[4606]: I1212 01:42:32.955978 4606 scope.go:117] "RemoveContainer" containerID="1632cda7e473e31214f35a52504e8863e32c4bf8c760028b86ec1d6ad9265c6d" Dec 12 01:44:32 crc kubenswrapper[4606]: I1212 01:44:32.010246 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:44:32 crc kubenswrapper[4606]: I1212 01:44:32.011111 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.168577 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb"] Dec 12 01:45:00 crc kubenswrapper[4606]: E1212 01:45:00.171389 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="registry-server" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.171546 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="registry-server" Dec 12 01:45:00 crc kubenswrapper[4606]: E1212 01:45:00.171662 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="extract-utilities" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.171770 4606 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="extract-utilities" Dec 12 01:45:00 crc kubenswrapper[4606]: E1212 01:45:00.173079 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="extract-content" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.173235 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="extract-content" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.173724 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="1526248f-6142-46fa-aa48-04097eab4666" containerName="registry-server" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.174912 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.177628 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.186283 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.192926 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb"] Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.327910 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qcs\" (UniqueName: \"kubernetes.io/projected/d899b88d-475c-43f4-9102-5141a71d6b5a-kube-api-access-b8qcs\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 
01:45:00.328024 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d899b88d-475c-43f4-9102-5141a71d6b5a-config-volume\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.328108 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d899b88d-475c-43f4-9102-5141a71d6b5a-secret-volume\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.430538 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qcs\" (UniqueName: \"kubernetes.io/projected/d899b88d-475c-43f4-9102-5141a71d6b5a-kube-api-access-b8qcs\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.430672 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d899b88d-475c-43f4-9102-5141a71d6b5a-config-volume\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.430769 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d899b88d-475c-43f4-9102-5141a71d6b5a-secret-volume\") pod \"collect-profiles-29425065-r66hb\" (UID: 
\"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.431948 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d899b88d-475c-43f4-9102-5141a71d6b5a-config-volume\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.446659 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d899b88d-475c-43f4-9102-5141a71d6b5a-secret-volume\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.454730 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qcs\" (UniqueName: \"kubernetes.io/projected/d899b88d-475c-43f4-9102-5141a71d6b5a-kube-api-access-b8qcs\") pod \"collect-profiles-29425065-r66hb\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:00 crc kubenswrapper[4606]: I1212 01:45:00.551904 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:01 crc kubenswrapper[4606]: I1212 01:45:01.049151 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb"] Dec 12 01:45:01 crc kubenswrapper[4606]: I1212 01:45:01.406800 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" event={"ID":"d899b88d-475c-43f4-9102-5141a71d6b5a","Type":"ContainerStarted","Data":"2ef0d3c604415893f5346154d7f67890fa1d03df5f590d63f09ab6e1aa480e7a"} Dec 12 01:45:01 crc kubenswrapper[4606]: I1212 01:45:01.407240 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" event={"ID":"d899b88d-475c-43f4-9102-5141a71d6b5a","Type":"ContainerStarted","Data":"de782977c5c4a42207d1a3b6bc85b288f2328df64f95dcb3a1d511b3ade5ded0"} Dec 12 01:45:01 crc kubenswrapper[4606]: I1212 01:45:01.442251 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" podStartSLOduration=1.442221649 podStartE2EDuration="1.442221649s" podCreationTimestamp="2025-12-12 01:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 01:45:01.431763371 +0000 UTC m=+4891.977116247" watchObservedRunningTime="2025-12-12 01:45:01.442221649 +0000 UTC m=+4891.987574515" Dec 12 01:45:01 crc kubenswrapper[4606]: E1212 01:45:01.583316 4606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd899b88d_475c_43f4_9102_5141a71d6b5a.slice/crio-2ef0d3c604415893f5346154d7f67890fa1d03df5f590d63f09ab6e1aa480e7a.scope\": RecentStats: unable to find data in memory cache]" 
Dec 12 01:45:02 crc kubenswrapper[4606]: I1212 01:45:02.010922 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:45:02 crc kubenswrapper[4606]: I1212 01:45:02.011451 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:45:02 crc kubenswrapper[4606]: I1212 01:45:02.420075 4606 generic.go:334] "Generic (PLEG): container finished" podID="d899b88d-475c-43f4-9102-5141a71d6b5a" containerID="2ef0d3c604415893f5346154d7f67890fa1d03df5f590d63f09ab6e1aa480e7a" exitCode=0 Dec 12 01:45:02 crc kubenswrapper[4606]: I1212 01:45:02.420142 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" event={"ID":"d899b88d-475c-43f4-9102-5141a71d6b5a","Type":"ContainerDied","Data":"2ef0d3c604415893f5346154d7f67890fa1d03df5f590d63f09ab6e1aa480e7a"} Dec 12 01:45:03 crc kubenswrapper[4606]: I1212 01:45:03.795204 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:03 crc kubenswrapper[4606]: I1212 01:45:03.948473 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d899b88d-475c-43f4-9102-5141a71d6b5a-config-volume\") pod \"d899b88d-475c-43f4-9102-5141a71d6b5a\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " Dec 12 01:45:03 crc kubenswrapper[4606]: I1212 01:45:03.948621 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d899b88d-475c-43f4-9102-5141a71d6b5a-secret-volume\") pod \"d899b88d-475c-43f4-9102-5141a71d6b5a\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " Dec 12 01:45:03 crc kubenswrapper[4606]: I1212 01:45:03.948703 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8qcs\" (UniqueName: \"kubernetes.io/projected/d899b88d-475c-43f4-9102-5141a71d6b5a-kube-api-access-b8qcs\") pod \"d899b88d-475c-43f4-9102-5141a71d6b5a\" (UID: \"d899b88d-475c-43f4-9102-5141a71d6b5a\") " Dec 12 01:45:03 crc kubenswrapper[4606]: I1212 01:45:03.949507 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d899b88d-475c-43f4-9102-5141a71d6b5a-config-volume" (OuterVolumeSpecName: "config-volume") pod "d899b88d-475c-43f4-9102-5141a71d6b5a" (UID: "d899b88d-475c-43f4-9102-5141a71d6b5a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:45:03 crc kubenswrapper[4606]: I1212 01:45:03.954480 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d899b88d-475c-43f4-9102-5141a71d6b5a-kube-api-access-b8qcs" (OuterVolumeSpecName: "kube-api-access-b8qcs") pod "d899b88d-475c-43f4-9102-5141a71d6b5a" (UID: "d899b88d-475c-43f4-9102-5141a71d6b5a"). 
InnerVolumeSpecName "kube-api-access-b8qcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:45:03 crc kubenswrapper[4606]: I1212 01:45:03.955305 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d899b88d-475c-43f4-9102-5141a71d6b5a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d899b88d-475c-43f4-9102-5141a71d6b5a" (UID: "d899b88d-475c-43f4-9102-5141a71d6b5a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.050921 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8qcs\" (UniqueName: \"kubernetes.io/projected/d899b88d-475c-43f4-9102-5141a71d6b5a-kube-api-access-b8qcs\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.051223 4606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d899b88d-475c-43f4-9102-5141a71d6b5a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.051379 4606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d899b88d-475c-43f4-9102-5141a71d6b5a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.439004 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" event={"ID":"d899b88d-475c-43f4-9102-5141a71d6b5a","Type":"ContainerDied","Data":"de782977c5c4a42207d1a3b6bc85b288f2328df64f95dcb3a1d511b3ade5ded0"} Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.439305 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de782977c5c4a42207d1a3b6bc85b288f2328df64f95dcb3a1d511b3ade5ded0" Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.439394 4606 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29425065-r66hb" Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.884499 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt"] Dec 12 01:45:04 crc kubenswrapper[4606]: I1212 01:45:04.896897 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29425020-mjtqt"] Dec 12 01:45:05 crc kubenswrapper[4606]: I1212 01:45:05.714345 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32af7f7b-1f31-4c55-9020-3401f0cbae70" path="/var/lib/kubelet/pods/32af7f7b-1f31-4c55-9020-3401f0cbae70/volumes" Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.879160 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jpzls"] Dec 12 01:45:16 crc kubenswrapper[4606]: E1212 01:45:16.880515 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d899b88d-475c-43f4-9102-5141a71d6b5a" containerName="collect-profiles" Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.880540 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d899b88d-475c-43f4-9102-5141a71d6b5a" containerName="collect-profiles" Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.880860 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d899b88d-475c-43f4-9102-5141a71d6b5a" containerName="collect-profiles" Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.882910 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.898933 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpzls"] Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.969033 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwqr\" (UniqueName: \"kubernetes.io/projected/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-kube-api-access-nzwqr\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.969262 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-utilities\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:16 crc kubenswrapper[4606]: I1212 01:45:16.969580 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-catalog-content\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.074905 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-utilities\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.075044 4606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-catalog-content\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.075090 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzwqr\" (UniqueName: \"kubernetes.io/projected/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-kube-api-access-nzwqr\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.075729 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-utilities\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.076053 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-catalog-content\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.116947 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzwqr\" (UniqueName: \"kubernetes.io/projected/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-kube-api-access-nzwqr\") pod \"certified-operators-jpzls\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.212759 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:17 crc kubenswrapper[4606]: I1212 01:45:17.749267 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpzls"] Dec 12 01:45:18 crc kubenswrapper[4606]: I1212 01:45:18.594939 4606 generic.go:334] "Generic (PLEG): container finished" podID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerID="c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d" exitCode=0 Dec 12 01:45:18 crc kubenswrapper[4606]: I1212 01:45:18.594981 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpzls" event={"ID":"47d7d1db-e4f6-415f-aa05-f03d5e10c39a","Type":"ContainerDied","Data":"c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d"} Dec 12 01:45:18 crc kubenswrapper[4606]: I1212 01:45:18.595370 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpzls" event={"ID":"47d7d1db-e4f6-415f-aa05-f03d5e10c39a","Type":"ContainerStarted","Data":"b7894d5731b7aae33d4db24133dad22d11f84df11a0dfa9b5fbcad58c79db0a6"} Dec 12 01:45:18 crc kubenswrapper[4606]: I1212 01:45:18.598857 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 01:45:20 crc kubenswrapper[4606]: I1212 01:45:20.615572 4606 generic.go:334] "Generic (PLEG): container finished" podID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerID="d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8" exitCode=0 Dec 12 01:45:20 crc kubenswrapper[4606]: I1212 01:45:20.615608 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpzls" event={"ID":"47d7d1db-e4f6-415f-aa05-f03d5e10c39a","Type":"ContainerDied","Data":"d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8"} Dec 12 01:45:22 crc kubenswrapper[4606]: I1212 01:45:22.639560 4606 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-jpzls" event={"ID":"47d7d1db-e4f6-415f-aa05-f03d5e10c39a","Type":"ContainerStarted","Data":"50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65"} Dec 12 01:45:22 crc kubenswrapper[4606]: I1212 01:45:22.663794 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jpzls" podStartSLOduration=3.387053322 podStartE2EDuration="6.663778721s" podCreationTimestamp="2025-12-12 01:45:16 +0000 UTC" firstStartedPulling="2025-12-12 01:45:18.598554425 +0000 UTC m=+4909.143907291" lastFinishedPulling="2025-12-12 01:45:21.875279784 +0000 UTC m=+4912.420632690" observedRunningTime="2025-12-12 01:45:22.658723027 +0000 UTC m=+4913.204075893" watchObservedRunningTime="2025-12-12 01:45:22.663778721 +0000 UTC m=+4913.209131587" Dec 12 01:45:27 crc kubenswrapper[4606]: I1212 01:45:27.213819 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:27 crc kubenswrapper[4606]: I1212 01:45:27.215428 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:27 crc kubenswrapper[4606]: I1212 01:45:27.677034 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:27 crc kubenswrapper[4606]: I1212 01:45:27.743080 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:28 crc kubenswrapper[4606]: I1212 01:45:28.849051 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpzls"] Dec 12 01:45:30 crc kubenswrapper[4606]: I1212 01:45:30.718065 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jpzls" 
podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerName="registry-server" containerID="cri-o://50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65" gracePeriod=2 Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.207860 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.349980 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwqr\" (UniqueName: \"kubernetes.io/projected/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-kube-api-access-nzwqr\") pod \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.350103 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-catalog-content\") pod \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.350185 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-utilities\") pod \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\" (UID: \"47d7d1db-e4f6-415f-aa05-f03d5e10c39a\") " Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.351041 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-utilities" (OuterVolumeSpecName: "utilities") pod "47d7d1db-e4f6-415f-aa05-f03d5e10c39a" (UID: "47d7d1db-e4f6-415f-aa05-f03d5e10c39a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.365665 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-kube-api-access-nzwqr" (OuterVolumeSpecName: "kube-api-access-nzwqr") pod "47d7d1db-e4f6-415f-aa05-f03d5e10c39a" (UID: "47d7d1db-e4f6-415f-aa05-f03d5e10c39a"). InnerVolumeSpecName "kube-api-access-nzwqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.399449 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47d7d1db-e4f6-415f-aa05-f03d5e10c39a" (UID: "47d7d1db-e4f6-415f-aa05-f03d5e10c39a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.454483 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwqr\" (UniqueName: \"kubernetes.io/projected/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-kube-api-access-nzwqr\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.454527 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.454541 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47d7d1db-e4f6-415f-aa05-f03d5e10c39a-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.730569 4606 generic.go:334] "Generic (PLEG): container finished" podID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" 
containerID="50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65" exitCode=0 Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.730638 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpzls" event={"ID":"47d7d1db-e4f6-415f-aa05-f03d5e10c39a","Type":"ContainerDied","Data":"50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65"} Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.730889 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpzls" event={"ID":"47d7d1db-e4f6-415f-aa05-f03d5e10c39a","Type":"ContainerDied","Data":"b7894d5731b7aae33d4db24133dad22d11f84df11a0dfa9b5fbcad58c79db0a6"} Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.730913 4606 scope.go:117] "RemoveContainer" containerID="50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.730918 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jpzls" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.765446 4606 scope.go:117] "RemoveContainer" containerID="d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.772133 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpzls"] Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.784900 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jpzls"] Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.790642 4606 scope.go:117] "RemoveContainer" containerID="c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.838022 4606 scope.go:117] "RemoveContainer" containerID="50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65" Dec 12 01:45:31 crc kubenswrapper[4606]: E1212 01:45:31.838618 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65\": container with ID starting with 50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65 not found: ID does not exist" containerID="50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.838722 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65"} err="failed to get container status \"50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65\": rpc error: code = NotFound desc = could not find container \"50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65\": container with ID starting with 50c1fb4ecafe25236bf89c7ca1fc219290451181f224e63395633effa9e87f65 not 
found: ID does not exist" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.838812 4606 scope.go:117] "RemoveContainer" containerID="d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8" Dec 12 01:45:31 crc kubenswrapper[4606]: E1212 01:45:31.839315 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8\": container with ID starting with d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8 not found: ID does not exist" containerID="d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.839353 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8"} err="failed to get container status \"d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8\": rpc error: code = NotFound desc = could not find container \"d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8\": container with ID starting with d4001b2d6bfd77ad690d4f8022c42386bb8c074b1499ff6a758e0e2677c942b8 not found: ID does not exist" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.839381 4606 scope.go:117] "RemoveContainer" containerID="c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d" Dec 12 01:45:31 crc kubenswrapper[4606]: E1212 01:45:31.839883 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d\": container with ID starting with c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d not found: ID does not exist" containerID="c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d" Dec 12 01:45:31 crc kubenswrapper[4606]: I1212 01:45:31.839914 4606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d"} err="failed to get container status \"c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d\": rpc error: code = NotFound desc = could not find container \"c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d\": container with ID starting with c835fb304a5c7ed2907041032820096eda9b729c9cd32efaddd46c5de38d338d not found: ID does not exist" Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.010326 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.010671 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.010822 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.011717 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.011864 4606 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" gracePeriod=600 Dec 12 01:45:32 crc kubenswrapper[4606]: E1212 01:45:32.141201 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.744094 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" exitCode=0 Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.744275 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8"} Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.744374 4606 scope.go:117] "RemoveContainer" containerID="cf2702dedf8aea1ffac5420ea4eaa7ba89f37e569f5fac1a39fd527f11622c29" Dec 12 01:45:32 crc kubenswrapper[4606]: I1212 01:45:32.745362 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:45:32 crc kubenswrapper[4606]: E1212 01:45:32.745738 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:45:33 crc kubenswrapper[4606]: I1212 01:45:33.734134 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" path="/var/lib/kubelet/pods/47d7d1db-e4f6-415f-aa05-f03d5e10c39a/volumes" Dec 12 01:45:41 crc kubenswrapper[4606]: I1212 01:45:41.853602 4606 generic.go:334] "Generic (PLEG): container finished" podID="4b099e40-725d-42e2-84fc-6ed969a20e5f" containerID="c0bd3e1cd3aaff04345df35a82960b7e5528a79f5456f54d7e0bf0565d0f9c28" exitCode=1 Dec 12 01:45:41 crc kubenswrapper[4606]: I1212 01:45:41.854146 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b099e40-725d-42e2-84fc-6ed969a20e5f","Type":"ContainerDied","Data":"c0bd3e1cd3aaff04345df35a82960b7e5528a79f5456f54d7e0bf0565d0f9c28"} Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.257848 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.331526 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config-secret\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.331954 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf5ch\" (UniqueName: \"kubernetes.io/projected/4b099e40-725d-42e2-84fc-6ed969a20e5f-kube-api-access-cf5ch\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.332027 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-config-data\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.332059 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ssh-key\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.332116 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ca-certs\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.332246 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.332339 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-temporary\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.332382 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-workdir\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.332458 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config\") pod \"4b099e40-725d-42e2-84fc-6ed969a20e5f\" (UID: \"4b099e40-725d-42e2-84fc-6ed969a20e5f\") " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.339142 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-config-data" (OuterVolumeSpecName: "config-data") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.339480 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.350668 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.360828 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.363646 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.371421 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b099e40-725d-42e2-84fc-6ed969a20e5f-kube-api-access-cf5ch" (OuterVolumeSpecName: "kube-api-access-cf5ch") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "kube-api-access-cf5ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.388553 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.390992 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.412297 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4b099e40-725d-42e2-84fc-6ed969a20e5f" (UID: "4b099e40-725d-42e2-84fc-6ed969a20e5f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434600 4606 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434636 4606 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b099e40-725d-42e2-84fc-6ed969a20e5f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434662 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434672 4606 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434682 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf5ch\" (UniqueName: \"kubernetes.io/projected/4b099e40-725d-42e2-84fc-6ed969a20e5f-kube-api-access-cf5ch\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434690 4606 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b099e40-725d-42e2-84fc-6ed969a20e5f-config-data\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434698 4606 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ssh-key\") on node \"crc\" 
DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.434706 4606 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b099e40-725d-42e2-84fc-6ed969a20e5f-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.435242 4606 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.454271 4606 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.537273 4606 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.701411 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:45:43 crc kubenswrapper[4606]: E1212 01:45:43.701939 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.876456 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b099e40-725d-42e2-84fc-6ed969a20e5f","Type":"ContainerDied","Data":"fd0ef95278326cc8cd5af7cacb365d6b41c6ce2731b5a2a8ad5191be2ef1a894"} Dec 12 01:45:43 crc 
kubenswrapper[4606]: I1212 01:45:43.876981 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0ef95278326cc8cd5af7cacb365d6b41c6ce2731b5a2a8ad5191be2ef1a894" Dec 12 01:45:43 crc kubenswrapper[4606]: I1212 01:45:43.876484 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.012663 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 12 01:45:46 crc kubenswrapper[4606]: E1212 01:45:46.014407 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerName="registry-server" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.014438 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerName="registry-server" Dec 12 01:45:46 crc kubenswrapper[4606]: E1212 01:45:46.014465 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerName="extract-utilities" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.014475 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerName="extract-utilities" Dec 12 01:45:46 crc kubenswrapper[4606]: E1212 01:45:46.014512 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b099e40-725d-42e2-84fc-6ed969a20e5f" containerName="tempest-tests-tempest-tests-runner" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.014521 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b099e40-725d-42e2-84fc-6ed969a20e5f" containerName="tempest-tests-tempest-tests-runner" Dec 12 01:45:46 crc kubenswrapper[4606]: E1212 01:45:46.014543 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" 
containerName="extract-content" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.014551 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerName="extract-content" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.014875 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b099e40-725d-42e2-84fc-6ed969a20e5f" containerName="tempest-tests-tempest-tests-runner" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.014912 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d7d1db-e4f6-415f-aa05-f03d5e10c39a" containerName="registry-server" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.015673 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.020392 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xrmfx" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.026703 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.107958 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hzfg\" (UniqueName: \"kubernetes.io/projected/c43bd731-87e3-4a9a-8399-5936fe0a315f-kube-api-access-4hzfg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c43bd731-87e3-4a9a-8399-5936fe0a315f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.108050 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" 
(UID: \"c43bd731-87e3-4a9a-8399-5936fe0a315f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.209888 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c43bd731-87e3-4a9a-8399-5936fe0a315f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.210035 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hzfg\" (UniqueName: \"kubernetes.io/projected/c43bd731-87e3-4a9a-8399-5936fe0a315f-kube-api-access-4hzfg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c43bd731-87e3-4a9a-8399-5936fe0a315f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.210676 4606 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c43bd731-87e3-4a9a-8399-5936fe0a315f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.255493 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c43bd731-87e3-4a9a-8399-5936fe0a315f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.257666 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4hzfg\" (UniqueName: \"kubernetes.io/projected/c43bd731-87e3-4a9a-8399-5936fe0a315f-kube-api-access-4hzfg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c43bd731-87e3-4a9a-8399-5936fe0a315f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.341798 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.781870 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 12 01:45:46 crc kubenswrapper[4606]: I1212 01:45:46.940092 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c43bd731-87e3-4a9a-8399-5936fe0a315f","Type":"ContainerStarted","Data":"fe08c47390ac51f09bd303cf7091f39564c1f95fc50aadee947e8030131e74bf"} Dec 12 01:45:48 crc kubenswrapper[4606]: I1212 01:45:48.958588 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c43bd731-87e3-4a9a-8399-5936fe0a315f","Type":"ContainerStarted","Data":"86b58bc50031f2d2adf32452606109fa9a43ee03b729baa8dc709bf82cde65c3"} Dec 12 01:45:48 crc kubenswrapper[4606]: I1212 01:45:48.979353 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.017977571 podStartE2EDuration="3.979319554s" podCreationTimestamp="2025-12-12 01:45:45 +0000 UTC" firstStartedPulling="2025-12-12 01:45:46.792622609 +0000 UTC m=+4937.337975475" lastFinishedPulling="2025-12-12 01:45:47.753964592 +0000 UTC m=+4938.299317458" observedRunningTime="2025-12-12 01:45:48.975849542 +0000 UTC m=+4939.521202418" watchObservedRunningTime="2025-12-12 
01:45:48.979319554 +0000 UTC m=+4939.524672440" Dec 12 01:45:55 crc kubenswrapper[4606]: I1212 01:45:55.699936 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:45:55 crc kubenswrapper[4606]: E1212 01:45:55.701849 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:46:04 crc kubenswrapper[4606]: I1212 01:46:04.742437 4606 scope.go:117] "RemoveContainer" containerID="6ec8ffdf536a2a4e21ce41c6e47cafa65561abafba622430f6d3b046f27ccb30" Dec 12 01:46:10 crc kubenswrapper[4606]: I1212 01:46:10.700285 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:46:10 crc kubenswrapper[4606]: E1212 01:46:10.701236 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:46:22 crc kubenswrapper[4606]: I1212 01:46:22.699782 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:46:22 crc kubenswrapper[4606]: E1212 01:46:22.701571 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.089279 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bh6lp/must-gather-xw4n5"] Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.094052 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.097276 4606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bh6lp"/"default-dockercfg-qtq67" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.097314 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bh6lp"/"kube-root-ca.crt" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.097397 4606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bh6lp"/"openshift-service-ca.crt" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.126091 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bh6lp/must-gather-xw4n5"] Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.230463 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lck2b\" (UniqueName: \"kubernetes.io/projected/26000cea-02cc-4449-b125-39aa4ca0015f-kube-api-access-lck2b\") pod \"must-gather-xw4n5\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.230551 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/26000cea-02cc-4449-b125-39aa4ca0015f-must-gather-output\") pod \"must-gather-xw4n5\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.331784 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26000cea-02cc-4449-b125-39aa4ca0015f-must-gather-output\") pod \"must-gather-xw4n5\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.332116 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lck2b\" (UniqueName: \"kubernetes.io/projected/26000cea-02cc-4449-b125-39aa4ca0015f-kube-api-access-lck2b\") pod \"must-gather-xw4n5\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.332657 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26000cea-02cc-4449-b125-39aa4ca0015f-must-gather-output\") pod \"must-gather-xw4n5\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.352000 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lck2b\" (UniqueName: \"kubernetes.io/projected/26000cea-02cc-4449-b125-39aa4ca0015f-kube-api-access-lck2b\") pod \"must-gather-xw4n5\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.422968 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:46:26 crc kubenswrapper[4606]: I1212 01:46:26.902622 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bh6lp/must-gather-xw4n5"] Dec 12 01:46:27 crc kubenswrapper[4606]: I1212 01:46:27.395004 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" event={"ID":"26000cea-02cc-4449-b125-39aa4ca0015f","Type":"ContainerStarted","Data":"bdc2af927e5680250597b4eadc8923ec48092d4b3986e952ec2ba07f75ba5853"} Dec 12 01:46:35 crc kubenswrapper[4606]: I1212 01:46:35.473265 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" event={"ID":"26000cea-02cc-4449-b125-39aa4ca0015f","Type":"ContainerStarted","Data":"cc179539ddc07f641d1fb9442736f702b44cd48f70827ccfe88e12570478b6a9"} Dec 12 01:46:35 crc kubenswrapper[4606]: I1212 01:46:35.473775 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" event={"ID":"26000cea-02cc-4449-b125-39aa4ca0015f","Type":"ContainerStarted","Data":"74e0cd3215c8d1f220c99a05faed3ceb40de2845f8e5cee6925167dbde206113"} Dec 12 01:46:35 crc kubenswrapper[4606]: I1212 01:46:35.492755 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" podStartSLOduration=1.539785787 podStartE2EDuration="9.49273486s" podCreationTimestamp="2025-12-12 01:46:26 +0000 UTC" firstStartedPulling="2025-12-12 01:46:26.908115661 +0000 UTC m=+4977.453468527" lastFinishedPulling="2025-12-12 01:46:34.861064734 +0000 UTC m=+4985.406417600" observedRunningTime="2025-12-12 01:46:35.48633623 +0000 UTC m=+4986.031689096" watchObservedRunningTime="2025-12-12 01:46:35.49273486 +0000 UTC m=+4986.038087726" Dec 12 01:46:35 crc kubenswrapper[4606]: I1212 01:46:35.700333 4606 scope.go:117] "RemoveContainer" 
containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:46:35 crc kubenswrapper[4606]: E1212 01:46:35.700644 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.390438 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-wsbqc"] Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.392018 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.531911 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e381f901-eeb3-4134-832d-051517b0281d-host\") pod \"crc-debug-wsbqc\" (UID: \"e381f901-eeb3-4134-832d-051517b0281d\") " pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.531981 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vtch\" (UniqueName: \"kubernetes.io/projected/e381f901-eeb3-4134-832d-051517b0281d-kube-api-access-9vtch\") pod \"crc-debug-wsbqc\" (UID: \"e381f901-eeb3-4134-832d-051517b0281d\") " pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.633658 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e381f901-eeb3-4134-832d-051517b0281d-host\") pod \"crc-debug-wsbqc\" (UID: 
\"e381f901-eeb3-4134-832d-051517b0281d\") " pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.633715 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vtch\" (UniqueName: \"kubernetes.io/projected/e381f901-eeb3-4134-832d-051517b0281d-kube-api-access-9vtch\") pod \"crc-debug-wsbqc\" (UID: \"e381f901-eeb3-4134-832d-051517b0281d\") " pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.634329 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e381f901-eeb3-4134-832d-051517b0281d-host\") pod \"crc-debug-wsbqc\" (UID: \"e381f901-eeb3-4134-832d-051517b0281d\") " pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:39 crc kubenswrapper[4606]: I1212 01:46:39.762797 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vtch\" (UniqueName: \"kubernetes.io/projected/e381f901-eeb3-4134-832d-051517b0281d-kube-api-access-9vtch\") pod \"crc-debug-wsbqc\" (UID: \"e381f901-eeb3-4134-832d-051517b0281d\") " pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:40 crc kubenswrapper[4606]: I1212 01:46:40.008576 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:46:40 crc kubenswrapper[4606]: W1212 01:46:40.036001 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode381f901_eeb3_4134_832d_051517b0281d.slice/crio-3c4c0cddf8a37ec96d03dcda92499a845ada5aff973061d2d77b7d28f335a61c WatchSource:0}: Error finding container 3c4c0cddf8a37ec96d03dcda92499a845ada5aff973061d2d77b7d28f335a61c: Status 404 returned error can't find the container with id 3c4c0cddf8a37ec96d03dcda92499a845ada5aff973061d2d77b7d28f335a61c Dec 12 01:46:40 crc kubenswrapper[4606]: I1212 01:46:40.521318 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" event={"ID":"e381f901-eeb3-4134-832d-051517b0281d","Type":"ContainerStarted","Data":"3c4c0cddf8a37ec96d03dcda92499a845ada5aff973061d2d77b7d28f335a61c"} Dec 12 01:46:48 crc kubenswrapper[4606]: I1212 01:46:48.700596 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:46:48 crc kubenswrapper[4606]: E1212 01:46:48.701354 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:46:53 crc kubenswrapper[4606]: I1212 01:46:53.644287 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" event={"ID":"e381f901-eeb3-4134-832d-051517b0281d","Type":"ContainerStarted","Data":"47a82264c77f652dc997af38d2a6b0fbc52ce206c7ab942aca55c3273342d8a3"} Dec 12 01:46:53 crc kubenswrapper[4606]: I1212 01:46:53.669216 4606 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" podStartSLOduration=2.164048552 podStartE2EDuration="14.669195701s" podCreationTimestamp="2025-12-12 01:46:39 +0000 UTC" firstStartedPulling="2025-12-12 01:46:40.037772275 +0000 UTC m=+4990.583125141" lastFinishedPulling="2025-12-12 01:46:52.542919414 +0000 UTC m=+5003.088272290" observedRunningTime="2025-12-12 01:46:53.66426685 +0000 UTC m=+5004.209619706" watchObservedRunningTime="2025-12-12 01:46:53.669195701 +0000 UTC m=+5004.214548567" Dec 12 01:46:59 crc kubenswrapper[4606]: I1212 01:46:59.699558 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:46:59 crc kubenswrapper[4606]: E1212 01:46:59.700283 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:47:11 crc kubenswrapper[4606]: I1212 01:47:11.700030 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:47:11 crc kubenswrapper[4606]: E1212 01:47:11.700901 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:47:22 crc kubenswrapper[4606]: I1212 01:47:22.700992 4606 scope.go:117] 
"RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:47:22 crc kubenswrapper[4606]: E1212 01:47:22.701708 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:47:37 crc kubenswrapper[4606]: I1212 01:47:37.699485 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:47:37 crc kubenswrapper[4606]: E1212 01:47:37.700239 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:47:45 crc kubenswrapper[4606]: I1212 01:47:45.158131 4606 generic.go:334] "Generic (PLEG): container finished" podID="e381f901-eeb3-4134-832d-051517b0281d" containerID="47a82264c77f652dc997af38d2a6b0fbc52ce206c7ab942aca55c3273342d8a3" exitCode=0 Dec 12 01:47:45 crc kubenswrapper[4606]: I1212 01:47:45.158222 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" event={"ID":"e381f901-eeb3-4134-832d-051517b0281d","Type":"ContainerDied","Data":"47a82264c77f652dc997af38d2a6b0fbc52ce206c7ab942aca55c3273342d8a3"} Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.322054 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.349764 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vtch\" (UniqueName: \"kubernetes.io/projected/e381f901-eeb3-4134-832d-051517b0281d-kube-api-access-9vtch\") pod \"e381f901-eeb3-4134-832d-051517b0281d\" (UID: \"e381f901-eeb3-4134-832d-051517b0281d\") " Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.350359 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e381f901-eeb3-4134-832d-051517b0281d-host\") pod \"e381f901-eeb3-4134-832d-051517b0281d\" (UID: \"e381f901-eeb3-4134-832d-051517b0281d\") " Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.350473 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e381f901-eeb3-4134-832d-051517b0281d-host" (OuterVolumeSpecName: "host") pod "e381f901-eeb3-4134-832d-051517b0281d" (UID: "e381f901-eeb3-4134-832d-051517b0281d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.351237 4606 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e381f901-eeb3-4134-832d-051517b0281d-host\") on node \"crc\" DevicePath \"\"" Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.353392 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-wsbqc"] Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.355668 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e381f901-eeb3-4134-832d-051517b0281d-kube-api-access-9vtch" (OuterVolumeSpecName: "kube-api-access-9vtch") pod "e381f901-eeb3-4134-832d-051517b0281d" (UID: "e381f901-eeb3-4134-832d-051517b0281d"). 
InnerVolumeSpecName "kube-api-access-9vtch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.364930 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-wsbqc"] Dec 12 01:47:46 crc kubenswrapper[4606]: I1212 01:47:46.453741 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vtch\" (UniqueName: \"kubernetes.io/projected/e381f901-eeb3-4134-832d-051517b0281d-kube-api-access-9vtch\") on node \"crc\" DevicePath \"\"" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.186656 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c4c0cddf8a37ec96d03dcda92499a845ada5aff973061d2d77b7d28f335a61c" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.187079 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-wsbqc" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.712876 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e381f901-eeb3-4134-832d-051517b0281d" path="/var/lib/kubelet/pods/e381f901-eeb3-4134-832d-051517b0281d/volumes" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.877204 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-l5sf7"] Dec 12 01:47:47 crc kubenswrapper[4606]: E1212 01:47:47.877620 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e381f901-eeb3-4134-832d-051517b0281d" containerName="container-00" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.877645 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e381f901-eeb3-4134-832d-051517b0281d" containerName="container-00" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.877902 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e381f901-eeb3-4134-832d-051517b0281d" containerName="container-00" Dec 12 01:47:47 crc 
kubenswrapper[4606]: I1212 01:47:47.878703 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.996224 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e754ff3c-9c23-4154-88e2-22a73af88429-host\") pod \"crc-debug-l5sf7\" (UID: \"e754ff3c-9c23-4154-88e2-22a73af88429\") " pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:47 crc kubenswrapper[4606]: I1212 01:47:47.996366 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlc5f\" (UniqueName: \"kubernetes.io/projected/e754ff3c-9c23-4154-88e2-22a73af88429-kube-api-access-wlc5f\") pod \"crc-debug-l5sf7\" (UID: \"e754ff3c-9c23-4154-88e2-22a73af88429\") " pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:48 crc kubenswrapper[4606]: I1212 01:47:48.098377 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlc5f\" (UniqueName: \"kubernetes.io/projected/e754ff3c-9c23-4154-88e2-22a73af88429-kube-api-access-wlc5f\") pod \"crc-debug-l5sf7\" (UID: \"e754ff3c-9c23-4154-88e2-22a73af88429\") " pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:48 crc kubenswrapper[4606]: I1212 01:47:48.099075 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e754ff3c-9c23-4154-88e2-22a73af88429-host\") pod \"crc-debug-l5sf7\" (UID: \"e754ff3c-9c23-4154-88e2-22a73af88429\") " pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:48 crc kubenswrapper[4606]: I1212 01:47:48.099121 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e754ff3c-9c23-4154-88e2-22a73af88429-host\") pod \"crc-debug-l5sf7\" (UID: 
\"e754ff3c-9c23-4154-88e2-22a73af88429\") " pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:48 crc kubenswrapper[4606]: I1212 01:47:48.118849 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlc5f\" (UniqueName: \"kubernetes.io/projected/e754ff3c-9c23-4154-88e2-22a73af88429-kube-api-access-wlc5f\") pod \"crc-debug-l5sf7\" (UID: \"e754ff3c-9c23-4154-88e2-22a73af88429\") " pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:48 crc kubenswrapper[4606]: I1212 01:47:48.203683 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:49 crc kubenswrapper[4606]: I1212 01:47:49.221472 4606 generic.go:334] "Generic (PLEG): container finished" podID="e754ff3c-9c23-4154-88e2-22a73af88429" containerID="df8f55afa453e3bb3b1ac0a41a255792d20d0df8f2dbabd1460b00b95021813f" exitCode=0 Dec 12 01:47:49 crc kubenswrapper[4606]: I1212 01:47:49.221550 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" event={"ID":"e754ff3c-9c23-4154-88e2-22a73af88429","Type":"ContainerDied","Data":"df8f55afa453e3bb3b1ac0a41a255792d20d0df8f2dbabd1460b00b95021813f"} Dec 12 01:47:49 crc kubenswrapper[4606]: I1212 01:47:49.221784 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" event={"ID":"e754ff3c-9c23-4154-88e2-22a73af88429","Type":"ContainerStarted","Data":"0f4ae3b2d01ef852a3a2696f2ecc4e47672f7d8e17c847ae50d0718f65808f54"} Dec 12 01:47:49 crc kubenswrapper[4606]: I1212 01:47:49.710781 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:47:49 crc kubenswrapper[4606]: E1212 01:47:49.711453 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:47:50 crc kubenswrapper[4606]: I1212 01:47:50.385840 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:50 crc kubenswrapper[4606]: I1212 01:47:50.543931 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e754ff3c-9c23-4154-88e2-22a73af88429-host\") pod \"e754ff3c-9c23-4154-88e2-22a73af88429\" (UID: \"e754ff3c-9c23-4154-88e2-22a73af88429\") " Dec 12 01:47:50 crc kubenswrapper[4606]: I1212 01:47:50.544688 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlc5f\" (UniqueName: \"kubernetes.io/projected/e754ff3c-9c23-4154-88e2-22a73af88429-kube-api-access-wlc5f\") pod \"e754ff3c-9c23-4154-88e2-22a73af88429\" (UID: \"e754ff3c-9c23-4154-88e2-22a73af88429\") " Dec 12 01:47:50 crc kubenswrapper[4606]: I1212 01:47:50.544021 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e754ff3c-9c23-4154-88e2-22a73af88429-host" (OuterVolumeSpecName: "host") pod "e754ff3c-9c23-4154-88e2-22a73af88429" (UID: "e754ff3c-9c23-4154-88e2-22a73af88429"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 01:47:50 crc kubenswrapper[4606]: I1212 01:47:50.551508 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e754ff3c-9c23-4154-88e2-22a73af88429-kube-api-access-wlc5f" (OuterVolumeSpecName: "kube-api-access-wlc5f") pod "e754ff3c-9c23-4154-88e2-22a73af88429" (UID: "e754ff3c-9c23-4154-88e2-22a73af88429"). InnerVolumeSpecName "kube-api-access-wlc5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:47:50 crc kubenswrapper[4606]: I1212 01:47:50.646382 4606 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e754ff3c-9c23-4154-88e2-22a73af88429-host\") on node \"crc\" DevicePath \"\"" Dec 12 01:47:50 crc kubenswrapper[4606]: I1212 01:47:50.646414 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlc5f\" (UniqueName: \"kubernetes.io/projected/e754ff3c-9c23-4154-88e2-22a73af88429-kube-api-access-wlc5f\") on node \"crc\" DevicePath \"\"" Dec 12 01:47:51 crc kubenswrapper[4606]: I1212 01:47:51.245012 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" event={"ID":"e754ff3c-9c23-4154-88e2-22a73af88429","Type":"ContainerDied","Data":"0f4ae3b2d01ef852a3a2696f2ecc4e47672f7d8e17c847ae50d0718f65808f54"} Dec 12 01:47:51 crc kubenswrapper[4606]: I1212 01:47:51.245046 4606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f4ae3b2d01ef852a3a2696f2ecc4e47672f7d8e17c847ae50d0718f65808f54" Dec 12 01:47:51 crc kubenswrapper[4606]: I1212 01:47:51.245099 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-l5sf7" Dec 12 01:47:51 crc kubenswrapper[4606]: I1212 01:47:51.341708 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-l5sf7"] Dec 12 01:47:51 crc kubenswrapper[4606]: I1212 01:47:51.352985 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-l5sf7"] Dec 12 01:47:51 crc kubenswrapper[4606]: I1212 01:47:51.710294 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e754ff3c-9c23-4154-88e2-22a73af88429" path="/var/lib/kubelet/pods/e754ff3c-9c23-4154-88e2-22a73af88429/volumes" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.550527 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-b4nzw"] Dec 12 01:47:52 crc kubenswrapper[4606]: E1212 01:47:52.550909 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e754ff3c-9c23-4154-88e2-22a73af88429" containerName="container-00" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.550929 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e754ff3c-9c23-4154-88e2-22a73af88429" containerName="container-00" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.551154 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e754ff3c-9c23-4154-88e2-22a73af88429" containerName="container-00" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.551784 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.683468 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/774193f9-0383-4b09-864c-8b99c0127627-host\") pod \"crc-debug-b4nzw\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.683588 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwvrk\" (UniqueName: \"kubernetes.io/projected/774193f9-0383-4b09-864c-8b99c0127627-kube-api-access-xwvrk\") pod \"crc-debug-b4nzw\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.785907 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwvrk\" (UniqueName: \"kubernetes.io/projected/774193f9-0383-4b09-864c-8b99c0127627-kube-api-access-xwvrk\") pod \"crc-debug-b4nzw\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.786316 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4h9b9"] Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.787582 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/774193f9-0383-4b09-864c-8b99c0127627-host\") pod \"crc-debug-b4nzw\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.787735 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/774193f9-0383-4b09-864c-8b99c0127627-host\") pod \"crc-debug-b4nzw\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.789754 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.805614 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwvrk\" (UniqueName: \"kubernetes.io/projected/774193f9-0383-4b09-864c-8b99c0127627-kube-api-access-xwvrk\") pod \"crc-debug-b4nzw\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.814746 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4h9b9"] Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.873729 4606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.893083 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-utilities\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.893277 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-catalog-content\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.893427 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8rz7\" (UniqueName: \"kubernetes.io/projected/3de99974-3651-4063-8d9c-8057f1835fc8-kube-api-access-v8rz7\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: W1212 01:47:52.900426 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod774193f9_0383_4b09_864c_8b99c0127627.slice/crio-861344cff4e69cf7b892f93d660d7286b9ab22576005b18fab6ddb81ef966aa6 WatchSource:0}: Error finding container 861344cff4e69cf7b892f93d660d7286b9ab22576005b18fab6ddb81ef966aa6: Status 404 returned error can't find the container with id 861344cff4e69cf7b892f93d660d7286b9ab22576005b18fab6ddb81ef966aa6 Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.995205 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-catalog-content\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.995494 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8rz7\" (UniqueName: \"kubernetes.io/projected/3de99974-3651-4063-8d9c-8057f1835fc8-kube-api-access-v8rz7\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.995627 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-utilities\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.995661 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-catalog-content\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:52 crc kubenswrapper[4606]: I1212 01:47:52.995991 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-utilities\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.011133 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rz7\" (UniqueName: 
\"kubernetes.io/projected/3de99974-3651-4063-8d9c-8057f1835fc8-kube-api-access-v8rz7\") pod \"redhat-operators-4h9b9\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.160931 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.265771 4606 generic.go:334] "Generic (PLEG): container finished" podID="774193f9-0383-4b09-864c-8b99c0127627" containerID="5fc1e64020794dbc5748c8369390ccce8c7cdd824d16255a49616a98c549ce44" exitCode=0 Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.265808 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" event={"ID":"774193f9-0383-4b09-864c-8b99c0127627","Type":"ContainerDied","Data":"5fc1e64020794dbc5748c8369390ccce8c7cdd824d16255a49616a98c549ce44"} Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.265831 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" event={"ID":"774193f9-0383-4b09-864c-8b99c0127627","Type":"ContainerStarted","Data":"861344cff4e69cf7b892f93d660d7286b9ab22576005b18fab6ddb81ef966aa6"} Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.307204 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-b4nzw"] Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.347079 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bh6lp/crc-debug-b4nzw"] Dec 12 01:47:53 crc kubenswrapper[4606]: I1212 01:47:53.752725 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4h9b9"] Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.274859 4606 generic.go:334] "Generic (PLEG): container finished" podID="3de99974-3651-4063-8d9c-8057f1835fc8" 
containerID="4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1" exitCode=0 Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.275999 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4h9b9" event={"ID":"3de99974-3651-4063-8d9c-8057f1835fc8","Type":"ContainerDied","Data":"4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1"} Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.276022 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4h9b9" event={"ID":"3de99974-3651-4063-8d9c-8057f1835fc8","Type":"ContainerStarted","Data":"0d350bb892bcf119205fb04e012e3b4c442d3406f8d826e9f179b6cbb6d9168d"} Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.373127 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.422975 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwvrk\" (UniqueName: \"kubernetes.io/projected/774193f9-0383-4b09-864c-8b99c0127627-kube-api-access-xwvrk\") pod \"774193f9-0383-4b09-864c-8b99c0127627\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.423295 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/774193f9-0383-4b09-864c-8b99c0127627-host\") pod \"774193f9-0383-4b09-864c-8b99c0127627\" (UID: \"774193f9-0383-4b09-864c-8b99c0127627\") " Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.423392 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774193f9-0383-4b09-864c-8b99c0127627-host" (OuterVolumeSpecName: "host") pod "774193f9-0383-4b09-864c-8b99c0127627" (UID: "774193f9-0383-4b09-864c-8b99c0127627"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.423743 4606 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/774193f9-0383-4b09-864c-8b99c0127627-host\") on node \"crc\" DevicePath \"\"" Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.432833 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774193f9-0383-4b09-864c-8b99c0127627-kube-api-access-xwvrk" (OuterVolumeSpecName: "kube-api-access-xwvrk") pod "774193f9-0383-4b09-864c-8b99c0127627" (UID: "774193f9-0383-4b09-864c-8b99c0127627"). InnerVolumeSpecName "kube-api-access-xwvrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:47:54 crc kubenswrapper[4606]: I1212 01:47:54.525612 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwvrk\" (UniqueName: \"kubernetes.io/projected/774193f9-0383-4b09-864c-8b99c0127627-kube-api-access-xwvrk\") on node \"crc\" DevicePath \"\"" Dec 12 01:47:55 crc kubenswrapper[4606]: I1212 01:47:55.284872 4606 scope.go:117] "RemoveContainer" containerID="5fc1e64020794dbc5748c8369390ccce8c7cdd824d16255a49616a98c549ce44" Dec 12 01:47:55 crc kubenswrapper[4606]: I1212 01:47:55.284946 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bh6lp/crc-debug-b4nzw" Dec 12 01:47:55 crc kubenswrapper[4606]: I1212 01:47:55.726908 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="774193f9-0383-4b09-864c-8b99c0127627" path="/var/lib/kubelet/pods/774193f9-0383-4b09-864c-8b99c0127627/volumes" Dec 12 01:47:56 crc kubenswrapper[4606]: I1212 01:47:56.297428 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4h9b9" event={"ID":"3de99974-3651-4063-8d9c-8057f1835fc8","Type":"ContainerStarted","Data":"9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0"} Dec 12 01:47:59 crc kubenswrapper[4606]: I1212 01:47:59.327744 4606 generic.go:334] "Generic (PLEG): container finished" podID="3de99974-3651-4063-8d9c-8057f1835fc8" containerID="9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0" exitCode=0 Dec 12 01:47:59 crc kubenswrapper[4606]: I1212 01:47:59.327826 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4h9b9" event={"ID":"3de99974-3651-4063-8d9c-8057f1835fc8","Type":"ContainerDied","Data":"9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0"} Dec 12 01:48:00 crc kubenswrapper[4606]: I1212 01:48:00.346557 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4h9b9" event={"ID":"3de99974-3651-4063-8d9c-8057f1835fc8","Type":"ContainerStarted","Data":"07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f"} Dec 12 01:48:00 crc kubenswrapper[4606]: I1212 01:48:00.367516 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4h9b9" podStartSLOduration=2.905497627 podStartE2EDuration="8.367469714s" podCreationTimestamp="2025-12-12 01:47:52 +0000 UTC" firstStartedPulling="2025-12-12 01:47:54.278081563 +0000 UTC m=+5064.823434429" lastFinishedPulling="2025-12-12 01:47:59.74005365 +0000 UTC 
m=+5070.285406516" observedRunningTime="2025-12-12 01:48:00.364834404 +0000 UTC m=+5070.910187280" watchObservedRunningTime="2025-12-12 01:48:00.367469714 +0000 UTC m=+5070.912822570" Dec 12 01:48:03 crc kubenswrapper[4606]: I1212 01:48:03.161754 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:48:03 crc kubenswrapper[4606]: I1212 01:48:03.163104 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:48:04 crc kubenswrapper[4606]: I1212 01:48:04.217450 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4h9b9" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="registry-server" probeResult="failure" output=< Dec 12 01:48:04 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 01:48:04 crc kubenswrapper[4606]: > Dec 12 01:48:04 crc kubenswrapper[4606]: I1212 01:48:04.699849 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:48:04 crc kubenswrapper[4606]: E1212 01:48:04.700359 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:48:13 crc kubenswrapper[4606]: I1212 01:48:13.797235 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:48:13 crc kubenswrapper[4606]: I1212 01:48:13.865600 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:48:14 crc kubenswrapper[4606]: I1212 01:48:14.044081 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4h9b9"] Dec 12 01:48:15 crc kubenswrapper[4606]: I1212 01:48:15.491111 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4h9b9" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="registry-server" containerID="cri-o://07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f" gracePeriod=2 Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.009568 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.136913 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-utilities\") pod \"3de99974-3651-4063-8d9c-8057f1835fc8\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.137073 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-catalog-content\") pod \"3de99974-3651-4063-8d9c-8057f1835fc8\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.137163 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8rz7\" (UniqueName: \"kubernetes.io/projected/3de99974-3651-4063-8d9c-8057f1835fc8-kube-api-access-v8rz7\") pod \"3de99974-3651-4063-8d9c-8057f1835fc8\" (UID: \"3de99974-3651-4063-8d9c-8057f1835fc8\") " Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.138578 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-utilities" (OuterVolumeSpecName: "utilities") pod "3de99974-3651-4063-8d9c-8057f1835fc8" (UID: "3de99974-3651-4063-8d9c-8057f1835fc8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.149511 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de99974-3651-4063-8d9c-8057f1835fc8-kube-api-access-v8rz7" (OuterVolumeSpecName: "kube-api-access-v8rz7") pod "3de99974-3651-4063-8d9c-8057f1835fc8" (UID: "3de99974-3651-4063-8d9c-8057f1835fc8"). InnerVolumeSpecName "kube-api-access-v8rz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.239076 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.239405 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8rz7\" (UniqueName: \"kubernetes.io/projected/3de99974-3651-4063-8d9c-8057f1835fc8-kube-api-access-v8rz7\") on node \"crc\" DevicePath \"\"" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.281525 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3de99974-3651-4063-8d9c-8057f1835fc8" (UID: "3de99974-3651-4063-8d9c-8057f1835fc8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.340653 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3de99974-3651-4063-8d9c-8057f1835fc8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.502690 4606 generic.go:334] "Generic (PLEG): container finished" podID="3de99974-3651-4063-8d9c-8057f1835fc8" containerID="07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f" exitCode=0 Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.502763 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4h9b9" event={"ID":"3de99974-3651-4063-8d9c-8057f1835fc8","Type":"ContainerDied","Data":"07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f"} Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.502818 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4h9b9" event={"ID":"3de99974-3651-4063-8d9c-8057f1835fc8","Type":"ContainerDied","Data":"0d350bb892bcf119205fb04e012e3b4c442d3406f8d826e9f179b6cbb6d9168d"} Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.502841 4606 scope.go:117] "RemoveContainer" containerID="07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.503057 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4h9b9" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.534696 4606 scope.go:117] "RemoveContainer" containerID="9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.562645 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4h9b9"] Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.566705 4606 scope.go:117] "RemoveContainer" containerID="4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.572248 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4h9b9"] Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.615673 4606 scope.go:117] "RemoveContainer" containerID="07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f" Dec 12 01:48:16 crc kubenswrapper[4606]: E1212 01:48:16.616113 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f\": container with ID starting with 07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f not found: ID does not exist" containerID="07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.616193 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f"} err="failed to get container status \"07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f\": rpc error: code = NotFound desc = could not find container \"07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f\": container with ID starting with 07c650e620f9022e2d669833af0f46f7bcc023431aa6859b0cfd247733dd9b2f not found: ID does 
not exist" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.616219 4606 scope.go:117] "RemoveContainer" containerID="9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0" Dec 12 01:48:16 crc kubenswrapper[4606]: E1212 01:48:16.616554 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0\": container with ID starting with 9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0 not found: ID does not exist" containerID="9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.616574 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0"} err="failed to get container status \"9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0\": rpc error: code = NotFound desc = could not find container \"9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0\": container with ID starting with 9e2fc478fe9312c079d0c0680d735ef3e346628190238e11bef24472bbbfacd0 not found: ID does not exist" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.616588 4606 scope.go:117] "RemoveContainer" containerID="4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1" Dec 12 01:48:16 crc kubenswrapper[4606]: E1212 01:48:16.618644 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1\": container with ID starting with 4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1 not found: ID does not exist" containerID="4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1" Dec 12 01:48:16 crc kubenswrapper[4606]: I1212 01:48:16.618681 4606 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1"} err="failed to get container status \"4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1\": rpc error: code = NotFound desc = could not find container \"4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1\": container with ID starting with 4b1c698dd88cbfd585302941fb5e49bb0f3cee61783dbc567ce40b37990820c1 not found: ID does not exist" Dec 12 01:48:17 crc kubenswrapper[4606]: I1212 01:48:17.718111 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" path="/var/lib/kubelet/pods/3de99974-3651-4063-8d9c-8057f1835fc8/volumes" Dec 12 01:48:18 crc kubenswrapper[4606]: I1212 01:48:18.700101 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:48:18 crc kubenswrapper[4606]: E1212 01:48:18.700386 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:48:21 crc kubenswrapper[4606]: I1212 01:48:21.644748 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fd8976c8-ks4n2_a82de1f4-ae7b-42bf-ae94-b39ba56b7e95/barbican-api/0.log" Dec 12 01:48:21 crc kubenswrapper[4606]: I1212 01:48:21.751943 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fd8976c8-ks4n2_a82de1f4-ae7b-42bf-ae94-b39ba56b7e95/barbican-api-log/0.log" Dec 12 01:48:21 crc kubenswrapper[4606]: I1212 01:48:21.839367 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-56c8787fc4-ckfzn_92eda0c3-4480-4e66-b349-144d9fb32bad/barbican-keystone-listener/0.log" Dec 12 01:48:21 crc kubenswrapper[4606]: I1212 01:48:21.919063 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56c8787fc4-ckfzn_92eda0c3-4480-4e66-b349-144d9fb32bad/barbican-keystone-listener-log/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.076766 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5555587ffc-hjrpk_71f55b69-c2e2-49eb-b468-b1e940b54f1e/barbican-worker/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.086365 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5555587ffc-hjrpk_71f55b69-c2e2-49eb-b468-b1e940b54f1e/barbican-worker-log/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.338630 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dbd84191-5cbb-48c5-af82-bfad9996ee60/ceilometer-central-agent/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.434546 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-mbv7c_cd9c36e5-43c7-4723-b818-e8b4129d578a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.469292 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dbd84191-5cbb-48c5-af82-bfad9996ee60/ceilometer-notification-agent/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.579632 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dbd84191-5cbb-48c5-af82-bfad9996ee60/proxy-httpd/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.665229 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_dbd84191-5cbb-48c5-af82-bfad9996ee60/sg-core/0.log" Dec 
12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.793989 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2cacde66-96b8-437e-86b5-aefba1e473ae/cinder-api-log/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.846971 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2cacde66-96b8-437e-86b5-aefba1e473ae/cinder-api/0.log" Dec 12 01:48:22 crc kubenswrapper[4606]: I1212 01:48:22.976577 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dedb8b2a-538c-4175-9ead-0d889ae2fd40/cinder-scheduler/0.log" Dec 12 01:48:23 crc kubenswrapper[4606]: I1212 01:48:23.025154 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dedb8b2a-538c-4175-9ead-0d889ae2fd40/probe/0.log" Dec 12 01:48:23 crc kubenswrapper[4606]: I1212 01:48:23.212129 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kz97r_ffbef1cf-b8bf-44ae-be40-52fa989c44d7/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:23 crc kubenswrapper[4606]: I1212 01:48:23.343652 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ngd7j_2cd82033-0e61-42e1-b532-65e0baa9d60e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:23 crc kubenswrapper[4606]: I1212 01:48:23.553620 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-94d468747-glhd9_9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9/init/0.log" Dec 12 01:48:23 crc kubenswrapper[4606]: I1212 01:48:23.713441 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-94d468747-glhd9_9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9/init/0.log" Dec 12 01:48:23 crc kubenswrapper[4606]: I1212 01:48:23.783769 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ff8s5_0d1c2004-cdcb-4729-a4f2-43a08adf9c04/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:23 crc kubenswrapper[4606]: I1212 01:48:23.935365 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-94d468747-glhd9_9a4ba3b3-cc4c-4f65-92c7-ffe91f827bf9/dnsmasq-dns/0.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.078574 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_24e9dfd8-9299-4981-b95d-a4200749037c/glance-httpd/0.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.087940 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_24e9dfd8-9299-4981-b95d-a4200749037c/glance-log/0.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.221657 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_aa0e877c-3d78-482d-8bb0-003663d82e4a/glance-httpd/0.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.288884 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_aa0e877c-3d78-482d-8bb0-003663d82e4a/glance-log/0.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.422284 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b9fb498f6-62fcc_e38df57e-1a86-4c45-bf40-6282a6a049ed/horizon/2.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.541651 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b9fb498f6-62fcc_e38df57e-1a86-4c45-bf40-6282a6a049ed/horizon/1.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.710861 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-9kplh_134d45c2-0084-4779-9125-b36e673a5cf8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:24 crc kubenswrapper[4606]: I1212 01:48:24.985250 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b9fb498f6-62fcc_e38df57e-1a86-4c45-bf40-6282a6a049ed/horizon-log/0.log" Dec 12 01:48:25 crc kubenswrapper[4606]: I1212 01:48:25.080561 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-h8w5g_9ffe19a0-667c-400f-b80f-3ddedcbec6dd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:25 crc kubenswrapper[4606]: I1212 01:48:25.520900 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29425021-wxzqx_dc2e18bf-398b-4f87-90c3-a14838b991d6/keystone-cron/0.log" Dec 12 01:48:25 crc kubenswrapper[4606]: I1212 01:48:25.694785 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76b48998b8-ff8r8_5b032f38-cd06-4fa3-9db7-6405dbaffaf4/keystone-api/0.log" Dec 12 01:48:25 crc kubenswrapper[4606]: I1212 01:48:25.754847 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_14b69a07-590a-4574-b5b9-de1bfe8c8fcf/kube-state-metrics/0.log" Dec 12 01:48:25 crc kubenswrapper[4606]: I1212 01:48:25.959856 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cdq4s_7698f12d-5dde-46ae-929e-472dfebb1a90/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:26 crc kubenswrapper[4606]: I1212 01:48:26.725586 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jm59d_9dafb619-e866-4f79-8e75-28c88fe1dfe7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:26 crc kubenswrapper[4606]: I1212 01:48:26.875919 4606 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_neutron-d9886cd8c-2vtxs_c923042a-1c66-4db8-8e92-fc41e2f19b4f/neutron-httpd/0.log" Dec 12 01:48:27 crc kubenswrapper[4606]: I1212 01:48:27.124221 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-d9886cd8c-2vtxs_c923042a-1c66-4db8-8e92-fc41e2f19b4f/neutron-api/0.log" Dec 12 01:48:27 crc kubenswrapper[4606]: I1212 01:48:27.750234 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a57ec7a6-8024-4745-8f9a-3c85bb363d82/nova-cell0-conductor-conductor/0.log" Dec 12 01:48:28 crc kubenswrapper[4606]: I1212 01:48:28.037353 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_11fcc545-3d1b-4a4a-b302-c1b565908edf/nova-cell1-conductor-conductor/0.log" Dec 12 01:48:28 crc kubenswrapper[4606]: I1212 01:48:28.440618 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4674f3cc-27d8-4e55-9fe1-f13378aefbc8/nova-cell1-novncproxy-novncproxy/0.log" Dec 12 01:48:29 crc kubenswrapper[4606]: I1212 01:48:29.130392 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-4ls7g_f70ab77b-2e78-421a-8563-8d8d0e049800/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:29 crc kubenswrapper[4606]: I1212 01:48:29.319517 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ab424069-d5cd-4b92-b1d8-1311cffefad6/nova-api-log/0.log" Dec 12 01:48:29 crc kubenswrapper[4606]: I1212 01:48:29.581038 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f3157a9f-4b11-4116-8be9-f4cb87e19b9f/nova-metadata-log/0.log" Dec 12 01:48:29 crc kubenswrapper[4606]: I1212 01:48:29.709745 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:48:29 crc kubenswrapper[4606]: E1212 01:48:29.710559 4606 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:48:29 crc kubenswrapper[4606]: I1212 01:48:29.874499 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ab424069-d5cd-4b92-b1d8-1311cffefad6/nova-api-api/0.log" Dec 12 01:48:30 crc kubenswrapper[4606]: I1212 01:48:30.228723 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c47ad26c-c149-41c4-8527-5e604b61e0f0/nova-scheduler-scheduler/0.log" Dec 12 01:48:30 crc kubenswrapper[4606]: I1212 01:48:30.576699 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_469d04e8-23ca-4aba-b3f1-0c4ad8da1562/mysql-bootstrap/0.log" Dec 12 01:48:30 crc kubenswrapper[4606]: I1212 01:48:30.811149 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_469d04e8-23ca-4aba-b3f1-0c4ad8da1562/mysql-bootstrap/0.log" Dec 12 01:48:30 crc kubenswrapper[4606]: I1212 01:48:30.893801 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_469d04e8-23ca-4aba-b3f1-0c4ad8da1562/galera/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.083273 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9/mysql-bootstrap/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.265418 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_eb404bf7-b4ad-4fd2-aeae-fc44a6315e39/memcached/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.372096 4606 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9/galera/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.394579 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bdf45e39-242c-4bf6-b3e5-7bbb9e0a72b9/mysql-bootstrap/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.437895 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f3157a9f-4b11-4116-8be9-f4cb87e19b9f/nova-metadata-metadata/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.569323 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0d0864c8-b45f-4324-a56f-ff583d488da0/openstackclient/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.641206 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-666ck_015ed993-f4fd-4928-a5ec-d13ad04b0105/ovn-controller/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.766002 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kzfxx_cba16cfd-fd29-4c8c-8c17-5fbecbdf6ee0/openstack-network-exporter/0.log" Dec 12 01:48:31 crc kubenswrapper[4606]: I1212 01:48:31.900187 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fh8p9_23cadcb5-094e-4dc3-af06-6f1186b6cb98/ovsdb-server-init/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.011344 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fh8p9_23cadcb5-094e-4dc3-af06-6f1186b6cb98/ovsdb-server-init/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.065673 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fh8p9_23cadcb5-094e-4dc3-af06-6f1186b6cb98/ovsdb-server/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.068548 4606 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fh8p9_23cadcb5-094e-4dc3-af06-6f1186b6cb98/ovs-vswitchd/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.201835 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zggvz_8b9161f7-ad7e-44ec-8fc1-cdc18e2c9d56/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.376669 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_07c6dfb8-2190-4189-a3b7-f85da57160a1/ovn-northd/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.377240 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_07c6dfb8-2190-4189-a3b7-f85da57160a1/openstack-network-exporter/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.454126 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_867593e3-7035-4358-8583-0d2cb0878282/openstack-network-exporter/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.511214 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_867593e3-7035-4358-8583-0d2cb0878282/ovsdbserver-nb/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.595552 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c68b0331-671b-4ca9-9f19-260d6faeada7/openstack-network-exporter/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.727588 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c68b0331-671b-4ca9-9f19-260d6faeada7/ovsdbserver-sb/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.897245 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9f4869f76-gqjw4_1ad12b18-e66a-4871-9a92-39e75070b4fb/placement-api/0.log" Dec 12 01:48:32 crc kubenswrapper[4606]: I1212 01:48:32.997388 4606 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b744992a-d383-4df5-859e-b24a8e70c1bb/setup-container/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.014189 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9f4869f76-gqjw4_1ad12b18-e66a-4871-9a92-39e75070b4fb/placement-log/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.221143 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b744992a-d383-4df5-859e-b24a8e70c1bb/setup-container/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.259935 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b744992a-d383-4df5-859e-b24a8e70c1bb/rabbitmq/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.399638 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_aec304bf-3003-493d-9e17-3a2f75997bdb/setup-container/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.494386 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_aec304bf-3003-493d-9e17-3a2f75997bdb/setup-container/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.577866 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_aec304bf-3003-493d-9e17-3a2f75997bdb/rabbitmq/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.597701 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qnzdn_979cac18-8b58-417f-907c-d0aa1cc7646a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.748812 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kqmmc_0118f359-bb8d-4b8a-be2b-0437ed43b303/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 
01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.825365 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lwk29_ba53dca3-2038-4e50-9c9e-ebed89ee7a86/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:33 crc kubenswrapper[4606]: I1212 01:48:33.869769 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fzq9w_dbfee972-c3b5-489c-adb3-c7e6720b67d0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.004611 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v76xq_b40df3ae-e397-4c46-ae83-781aafd30e5e/ssh-known-hosts-edpm-deployment/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.103879 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b868b57d5-xjh67_723ab405-5905-44a6-a625-39fbc78948ef/proxy-server/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.201590 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-t97v8_d4a54eac-00ee-452a-9c4b-e777e338e670/swift-ring-rebalance/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.213615 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b868b57d5-xjh67_723ab405-5905-44a6-a625-39fbc78948ef/proxy-httpd/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.330881 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/account-auditor/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.436662 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/account-reaper/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.479384 4606 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/account-replicator/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.567129 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/account-server/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.609823 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/container-replicator/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.627073 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/container-auditor/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.630274 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/container-server/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.770362 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/container-updater/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.806095 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/object-auditor/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.837029 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/object-replicator/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.861629 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/object-expirer/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.877368 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/object-server/0.log" Dec 12 01:48:34 crc kubenswrapper[4606]: I1212 01:48:34.970369 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/object-updater/0.log" Dec 12 01:48:35 crc kubenswrapper[4606]: I1212 01:48:35.034410 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/rsync/0.log" Dec 12 01:48:35 crc kubenswrapper[4606]: I1212 01:48:35.049423 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6963a48d-4eff-4349-bc36-2356ec73c08c/swift-recon-cron/0.log" Dec 12 01:48:35 crc kubenswrapper[4606]: I1212 01:48:35.150338 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2bdq4_bf6da039-4ee5-40c3-90b0-606cd302ee04/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:35 crc kubenswrapper[4606]: I1212 01:48:35.371542 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c43bd731-87e3-4a9a-8399-5936fe0a315f/test-operator-logs-container/0.log" Dec 12 01:48:35 crc kubenswrapper[4606]: I1212 01:48:35.566230 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gph9x_5a1bfeda-d153-4465-b060-5f8b3b5d5b23/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 12 01:48:35 crc kubenswrapper[4606]: I1212 01:48:35.661565 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4b099e40-725d-42e2-84fc-6ed969a20e5f/tempest-tests-tempest-tests-runner/0.log" Dec 12 01:48:41 crc kubenswrapper[4606]: I1212 01:48:41.699891 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 
01:48:41 crc kubenswrapper[4606]: E1212 01:48:41.700553 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.219798 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dxvf6"] Dec 12 01:48:51 crc kubenswrapper[4606]: E1212 01:48:51.220789 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="registry-server" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.220802 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="registry-server" Dec 12 01:48:51 crc kubenswrapper[4606]: E1212 01:48:51.220825 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="extract-content" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.220831 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="extract-content" Dec 12 01:48:51 crc kubenswrapper[4606]: E1212 01:48:51.220846 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="extract-utilities" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.220853 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="extract-utilities" Dec 12 01:48:51 crc kubenswrapper[4606]: E1212 01:48:51.220880 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="774193f9-0383-4b09-864c-8b99c0127627" containerName="container-00" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.220886 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="774193f9-0383-4b09-864c-8b99c0127627" containerName="container-00" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.221053 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="774193f9-0383-4b09-864c-8b99c0127627" containerName="container-00" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.221075 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de99974-3651-4063-8d9c-8057f1835fc8" containerName="registry-server" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.222458 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.227469 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dxvf6"] Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.311464 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-utilities\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.311634 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8wz\" (UniqueName: \"kubernetes.io/projected/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-kube-api-access-jx8wz\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.311751 4606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-catalog-content\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.413026 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-catalog-content\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.413103 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-utilities\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.413164 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8wz\" (UniqueName: \"kubernetes.io/projected/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-kube-api-access-jx8wz\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.414228 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-utilities\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.414346 4606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-catalog-content\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.434911 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8wz\" (UniqueName: \"kubernetes.io/projected/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-kube-api-access-jx8wz\") pod \"community-operators-dxvf6\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:51 crc kubenswrapper[4606]: I1212 01:48:51.569671 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:48:52 crc kubenswrapper[4606]: I1212 01:48:52.161604 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dxvf6"] Dec 12 01:48:52 crc kubenswrapper[4606]: I1212 01:48:52.699360 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:48:52 crc kubenswrapper[4606]: E1212 01:48:52.699849 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:48:52 crc kubenswrapper[4606]: I1212 01:48:52.857953 4606 generic.go:334] "Generic (PLEG): container finished" podID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerID="88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91" exitCode=0 Dec 12 
01:48:52 crc kubenswrapper[4606]: I1212 01:48:52.858046 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dxvf6" event={"ID":"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8","Type":"ContainerDied","Data":"88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91"} Dec 12 01:48:52 crc kubenswrapper[4606]: I1212 01:48:52.858084 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dxvf6" event={"ID":"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8","Type":"ContainerStarted","Data":"805fcb1c5ef3038d6e547ddfb852622774db5d6220a82537b829cf12bc9374fc"} Dec 12 01:48:53 crc kubenswrapper[4606]: I1212 01:48:53.872715 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dxvf6" event={"ID":"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8","Type":"ContainerStarted","Data":"4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4"} Dec 12 01:48:54 crc kubenswrapper[4606]: I1212 01:48:54.885496 4606 generic.go:334] "Generic (PLEG): container finished" podID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerID="4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4" exitCode=0 Dec 12 01:48:54 crc kubenswrapper[4606]: I1212 01:48:54.885556 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dxvf6" event={"ID":"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8","Type":"ContainerDied","Data":"4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4"} Dec 12 01:48:55 crc kubenswrapper[4606]: I1212 01:48:55.895789 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dxvf6" event={"ID":"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8","Type":"ContainerStarted","Data":"eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230"} Dec 12 01:48:55 crc kubenswrapper[4606]: I1212 01:48:55.928999 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-dxvf6" podStartSLOduration=2.400184998 podStartE2EDuration="4.928970232s" podCreationTimestamp="2025-12-12 01:48:51 +0000 UTC" firstStartedPulling="2025-12-12 01:48:52.897194744 +0000 UTC m=+5123.442547620" lastFinishedPulling="2025-12-12 01:48:55.425979988 +0000 UTC m=+5125.971332854" observedRunningTime="2025-12-12 01:48:55.921287809 +0000 UTC m=+5126.466640675" watchObservedRunningTime="2025-12-12 01:48:55.928970232 +0000 UTC m=+5126.474323098" Dec 12 01:49:01 crc kubenswrapper[4606]: I1212 01:49:01.570774 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:49:01 crc kubenswrapper[4606]: I1212 01:49:01.571115 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:49:01 crc kubenswrapper[4606]: I1212 01:49:01.620019 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:49:02 crc kubenswrapper[4606]: I1212 01:49:02.066632 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:49:02 crc kubenswrapper[4606]: I1212 01:49:02.127761 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dxvf6"] Dec 12 01:49:03 crc kubenswrapper[4606]: I1212 01:49:03.959161 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dxvf6" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="registry-server" containerID="cri-o://eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230" gracePeriod=2 Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.007137 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-97g95_a6f8bedd-5eb2-4092-abd9-34f8ccbed690/manager/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.026341 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-97g95_a6f8bedd-5eb2-4092-abd9-34f8ccbed690/kube-rbac-proxy/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.269815 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-9npjw_b5316be9-1796-4bf0-aabf-ac9cf01c709b/kube-rbac-proxy/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.304922 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-9npjw_b5316be9-1796-4bf0-aabf-ac9cf01c709b/manager/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.312895 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5_50a3ac6d-a23e-479c-9356-1b42add509da/util/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.574606 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.668641 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5_50a3ac6d-a23e-479c-9356-1b42add509da/util/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.682659 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-catalog-content\") pod \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.682715 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-utilities\") pod \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.682969 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx8wz\" (UniqueName: \"kubernetes.io/projected/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-kube-api-access-jx8wz\") pod \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\" (UID: \"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8\") " Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.683871 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-utilities" (OuterVolumeSpecName: "utilities") pod "86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" (UID: "86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.703139 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-kube-api-access-jx8wz" (OuterVolumeSpecName: "kube-api-access-jx8wz") pod "86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" (UID: "86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8"). InnerVolumeSpecName "kube-api-access-jx8wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.704745 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5_50a3ac6d-a23e-479c-9356-1b42add509da/pull/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.722900 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5_50a3ac6d-a23e-479c-9356-1b42add509da/pull/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.744254 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" (UID: "86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.785010 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx8wz\" (UniqueName: \"kubernetes.io/projected/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-kube-api-access-jx8wz\") on node \"crc\" DevicePath \"\"" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.785041 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.785051 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.937384 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5_50a3ac6d-a23e-479c-9356-1b42add509da/pull/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.945721 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5_50a3ac6d-a23e-479c-9356-1b42add509da/extract/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.969100 4606 generic.go:334] "Generic (PLEG): container finished" podID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerID="eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230" exitCode=0 Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.969141 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dxvf6" event={"ID":"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8","Type":"ContainerDied","Data":"eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230"} Dec 12 01:49:04 crc 
kubenswrapper[4606]: I1212 01:49:04.969168 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dxvf6" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.969228 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dxvf6" event={"ID":"86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8","Type":"ContainerDied","Data":"805fcb1c5ef3038d6e547ddfb852622774db5d6220a82537b829cf12bc9374fc"} Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.969252 4606 scope.go:117] "RemoveContainer" containerID="eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.988677 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd89fd431413de875134da266d7ab4d16c4fc0ad81c223e66ec802ddc75wwz5_50a3ac6d-a23e-479c-9356-1b42add509da/util/0.log" Dec 12 01:49:04 crc kubenswrapper[4606]: I1212 01:49:04.995068 4606 scope.go:117] "RemoveContainer" containerID="4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.013286 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dxvf6"] Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.026282 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dxvf6"] Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.038196 4606 scope.go:117] "RemoveContainer" containerID="88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.154604 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-2c5hc_93b508cc-be40-4c34-a5ea-81b58893894e/kube-rbac-proxy/0.log" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.196039 4606 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-2c5hc_93b508cc-be40-4c34-a5ea-81b58893894e/manager/0.log" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.588460 4606 scope.go:117] "RemoveContainer" containerID="eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230" Dec 12 01:49:05 crc kubenswrapper[4606]: E1212 01:49:05.589375 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230\": container with ID starting with eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230 not found: ID does not exist" containerID="eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.589443 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230"} err="failed to get container status \"eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230\": rpc error: code = NotFound desc = could not find container \"eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230\": container with ID starting with eb933663672a904ad71cc03c6a5ca5727c653039d466fdd01a64c63bfb994230 not found: ID does not exist" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.589483 4606 scope.go:117] "RemoveContainer" containerID="4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4" Dec 12 01:49:05 crc kubenswrapper[4606]: E1212 01:49:05.590629 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4\": container with ID starting with 4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4 not found: ID does not exist" 
containerID="4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.590684 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4"} err="failed to get container status \"4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4\": rpc error: code = NotFound desc = could not find container \"4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4\": container with ID starting with 4828f18d57d8d062b61c3e3da4e6584081c59b89ccf80fb2dd36379ae8d754c4 not found: ID does not exist" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.590713 4606 scope.go:117] "RemoveContainer" containerID="88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91" Dec 12 01:49:05 crc kubenswrapper[4606]: E1212 01:49:05.591695 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91\": container with ID starting with 88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91 not found: ID does not exist" containerID="88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.591736 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91"} err="failed to get container status \"88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91\": rpc error: code = NotFound desc = could not find container \"88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91\": container with ID starting with 88613bbe238c2922333cb9de5bf9baf37a10309465434a95162a4b63c017de91 not found: ID does not exist" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.702959 4606 scope.go:117] 
"RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:49:05 crc kubenswrapper[4606]: E1212 01:49:05.708218 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.726453 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" path="/var/lib/kubelet/pods/86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8/volumes" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.860206 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-n54mf_1d9582d9-c931-4b43-8431-407d6c98cbc1/kube-rbac-proxy/0.log" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.862185 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-mzf56_1c42899f-ae12-4c9b-b012-6ead724854cb/manager/0.log" Dec 12 01:49:05 crc kubenswrapper[4606]: I1212 01:49:05.888273 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-mzf56_1c42899f-ae12-4c9b-b012-6ead724854cb/kube-rbac-proxy/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.056353 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-w5849_7663a2be-d4ba-43d4-bd35-7bf4b969a72d/kube-rbac-proxy/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.128678 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-w5849_7663a2be-d4ba-43d4-bd35-7bf4b969a72d/manager/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.181369 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-n54mf_1d9582d9-c931-4b43-8431-407d6c98cbc1/manager/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.343490 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-mnqs5_9ec63351-044c-4c07-b021-a2835b2290c8/kube-rbac-proxy/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.532583 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-zgnm9_181d9f8e-1256-417e-ae8b-cc71d7fdc2b7/kube-rbac-proxy/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.619111 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-mnqs5_9ec63351-044c-4c07-b021-a2835b2290c8/manager/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.658392 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-zgnm9_181d9f8e-1256-417e-ae8b-cc71d7fdc2b7/manager/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.759602 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-bdj9z_19a5895a-f008-411d-9ac2-6122eb52aa1e/kube-rbac-proxy/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.862376 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-bdj9z_19a5895a-f008-411d-9ac2-6122eb52aa1e/manager/0.log" Dec 12 01:49:06 crc kubenswrapper[4606]: I1212 01:49:06.950323 4606 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-66jql_6130b694-1b33-495f-b0af-481805aa4727/kube-rbac-proxy/0.log" Dec 12 01:49:07 crc kubenswrapper[4606]: I1212 01:49:07.358199 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-66jql_6130b694-1b33-495f-b0af-481805aa4727/manager/0.log" Dec 12 01:49:07 crc kubenswrapper[4606]: I1212 01:49:07.409931 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-j44wr_7f8a5b5c-6158-4f24-8323-2afd6b9b2664/kube-rbac-proxy/0.log" Dec 12 01:49:07 crc kubenswrapper[4606]: I1212 01:49:07.460070 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-j44wr_7f8a5b5c-6158-4f24-8323-2afd6b9b2664/manager/0.log" Dec 12 01:49:07 crc kubenswrapper[4606]: I1212 01:49:07.650615 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-h5tgw_1d4554d9-9dc1-4d74-b8ea-f4c886c08fde/kube-rbac-proxy/0.log" Dec 12 01:49:07 crc kubenswrapper[4606]: I1212 01:49:07.670317 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-h5tgw_1d4554d9-9dc1-4d74-b8ea-f4c886c08fde/manager/0.log" Dec 12 01:49:07 crc kubenswrapper[4606]: I1212 01:49:07.872276 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-wqj4k_a6d74506-7048-4b2d-ba7f-46e83a508405/kube-rbac-proxy/0.log" Dec 12 01:49:07 crc kubenswrapper[4606]: I1212 01:49:07.955935 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-wqj4k_a6d74506-7048-4b2d-ba7f-46e83a508405/manager/0.log" Dec 12 01:49:08 crc 
kubenswrapper[4606]: I1212 01:49:08.023472 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gtwmt_616771f5-4be8-4f22-86d8-dcd4a365a311/kube-rbac-proxy/0.log" Dec 12 01:49:08 crc kubenswrapper[4606]: I1212 01:49:08.085695 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-gtwmt_616771f5-4be8-4f22-86d8-dcd4a365a311/manager/0.log" Dec 12 01:49:08 crc kubenswrapper[4606]: I1212 01:49:08.161338 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fkjwr5_02def546-751a-46ac-848a-367f0a7f84cb/manager/0.log" Dec 12 01:49:08 crc kubenswrapper[4606]: I1212 01:49:08.257922 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fkjwr5_02def546-751a-46ac-848a-367f0a7f84cb/kube-rbac-proxy/0.log" Dec 12 01:49:08 crc kubenswrapper[4606]: I1212 01:49:08.618497 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-686f4c6566-qf7w8_6f1636a2-66b9-4641-9779-34142a76a14f/operator/0.log" Dec 12 01:49:08 crc kubenswrapper[4606]: I1212 01:49:08.648436 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p546n_89ebf2d8-d7a1-4b1c-a90c-8236306cc7bd/registry-server/0.log" Dec 12 01:49:08 crc kubenswrapper[4606]: I1212 01:49:08.835829 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hcqxk_3b429293-caf6-47e1-9976-01d6fca19c6c/kube-rbac-proxy/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.015268 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-pq6pj_8d1093f3-e1d5-45be-9682-2f3ccf90eda2/kube-rbac-proxy/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.064257 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hcqxk_3b429293-caf6-47e1-9976-01d6fca19c6c/manager/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.151479 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-pq6pj_8d1093f3-e1d5-45be-9682-2f3ccf90eda2/manager/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.354586 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-6lzdf_e1c99848-c685-4782-bb57-71217db4db6c/operator/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.432199 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6f7f89d9c9-mfj2g_d681c7e6-bef3-4733-875c-45d6b60643e5/manager/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.470818 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-g4nxc_0b3e1e95-7581-4453-af8b-6a23e4bba5fe/kube-rbac-proxy/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.577618 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-g4nxc_0b3e1e95-7581-4453-af8b-6a23e4bba5fe/manager/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.625382 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-st6cm_2a27185b-308d-419c-bc01-26714a1f0394/kube-rbac-proxy/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.751306 4606 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-st6cm_2a27185b-308d-419c-bc01-26714a1f0394/manager/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.780908 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-qjxxm_d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0/kube-rbac-proxy/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.866148 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-qjxxm_d7b0479e-9d3b-48b0-a7dd-6388faf6cfc0/manager/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.933550 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-z2b8c_f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8/kube-rbac-proxy/0.log" Dec 12 01:49:09 crc kubenswrapper[4606]: I1212 01:49:09.994246 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-z2b8c_f289c6b8-d4dd-40da-ac6a-0249b4a3e9f8/manager/0.log" Dec 12 01:49:19 crc kubenswrapper[4606]: I1212 01:49:19.708634 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:49:19 crc kubenswrapper[4606]: E1212 01:49:19.709466 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:49:32 crc kubenswrapper[4606]: I1212 01:49:32.633483 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4p7d9_848f49c7-d7b8-4490-9956-4014339c4a31/control-plane-machine-set-operator/0.log" Dec 12 01:49:32 crc kubenswrapper[4606]: I1212 01:49:32.823668 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pkx4d_9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b/kube-rbac-proxy/0.log" Dec 12 01:49:32 crc kubenswrapper[4606]: I1212 01:49:32.855873 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pkx4d_9d84c7aa-3fad-4d0c-ba9a-5577ba892a5b/machine-api-operator/0.log" Dec 12 01:49:34 crc kubenswrapper[4606]: I1212 01:49:34.699737 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:49:34 crc kubenswrapper[4606]: E1212 01:49:34.700156 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:49:45 crc kubenswrapper[4606]: I1212 01:49:45.700129 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:49:45 crc kubenswrapper[4606]: E1212 01:49:45.700743 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" 
podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:49:46 crc kubenswrapper[4606]: I1212 01:49:46.577217 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-8wvkl_4def20cb-2590-41e7-9c98-6fd10a84d049/cert-manager-controller/0.log" Dec 12 01:49:46 crc kubenswrapper[4606]: I1212 01:49:46.714303 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-vgqkv_95dcdca5-05de-43d2-a86c-757b112d1cd5/cert-manager-cainjector/0.log" Dec 12 01:49:46 crc kubenswrapper[4606]: I1212 01:49:46.764514 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2vtxw_886ac20b-ef3b-459a-8539-6a7040bcd6fb/cert-manager-webhook/0.log" Dec 12 01:49:57 crc kubenswrapper[4606]: I1212 01:49:57.699713 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:49:57 crc kubenswrapper[4606]: E1212 01:49:57.700560 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:50:00 crc kubenswrapper[4606]: I1212 01:50:00.100680 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-82lk9_27901900-87d9-45a6-a5cb-1fcf505917ee/nmstate-console-plugin/0.log" Dec 12 01:50:00 crc kubenswrapper[4606]: I1212 01:50:00.266492 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m5nlb_c5c179d2-3e8f-4aa4-8b37-737c167dd42f/nmstate-handler/0.log" Dec 12 01:50:00 crc kubenswrapper[4606]: I1212 01:50:00.308596 4606 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-95nwl_042d210a-3148-4706-8e99-798c7cab2239/kube-rbac-proxy/0.log" Dec 12 01:50:00 crc kubenswrapper[4606]: I1212 01:50:00.335319 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-95nwl_042d210a-3148-4706-8e99-798c7cab2239/nmstate-metrics/0.log" Dec 12 01:50:00 crc kubenswrapper[4606]: I1212 01:50:00.810990 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-9fhgf_afff02ee-90b6-4315-ae47-8c8585994b6d/nmstate-operator/0.log" Dec 12 01:50:00 crc kubenswrapper[4606]: I1212 01:50:00.856394 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-xpnht_fccbfa74-64a7-4920-a145-abde992f617d/nmstate-webhook/0.log" Dec 12 01:50:12 crc kubenswrapper[4606]: I1212 01:50:12.700445 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:50:12 crc kubenswrapper[4606]: E1212 01:50:12.701232 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:50:16 crc kubenswrapper[4606]: I1212 01:50:16.856957 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-74lqn_c2e79dcf-8eee-4042-b9b0-8edcf88f3fce/kube-rbac-proxy/0.log" Dec 12 01:50:16 crc kubenswrapper[4606]: I1212 01:50:16.997424 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5bddd4b946-74lqn_c2e79dcf-8eee-4042-b9b0-8edcf88f3fce/controller/0.log" Dec 12 01:50:17 crc kubenswrapper[4606]: I1212 01:50:17.087489 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-frr-files/0.log" Dec 12 01:50:17 crc kubenswrapper[4606]: I1212 01:50:17.271620 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-frr-files/0.log" Dec 12 01:50:17 crc kubenswrapper[4606]: I1212 01:50:17.286278 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-metrics/0.log" Dec 12 01:50:17 crc kubenswrapper[4606]: I1212 01:50:17.300259 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-reloader/0.log" Dec 12 01:50:17 crc kubenswrapper[4606]: I1212 01:50:17.339768 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-reloader/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.240773 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-metrics/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.280750 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-frr-files/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.338468 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-metrics/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.349117 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-reloader/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.506671 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-frr-files/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.506780 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-reloader/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.579514 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/cp-metrics/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.606375 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/controller/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.793620 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/kube-rbac-proxy/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.827794 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/kube-rbac-proxy-frr/0.log" Dec 12 01:50:18 crc kubenswrapper[4606]: I1212 01:50:18.880695 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/frr-metrics/0.log" Dec 12 01:50:19 crc kubenswrapper[4606]: I1212 01:50:19.000844 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/reloader/0.log" Dec 12 01:50:19 crc kubenswrapper[4606]: I1212 01:50:19.181975 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-gx2jt_886be8e2-677e-4bd4-81cf-032dd6d8a890/frr-k8s-webhook-server/0.log" Dec 12 01:50:19 crc kubenswrapper[4606]: I1212 01:50:19.394410 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-77595d9574-hx5wt_2d54ebef-2685-424c-8d1a-7d3d56a8681c/manager/0.log" Dec 12 01:50:19 crc kubenswrapper[4606]: I1212 01:50:19.922110 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-76c66465b9-m7hxs_5d954396-d8c4-45d2-97b3-3606eb503029/webhook-server/0.log" Dec 12 01:50:20 crc kubenswrapper[4606]: I1212 01:50:20.078242 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4sjm_5d1c165f-9379-412b-b7aa-6e4da7c4717a/frr/0.log" Dec 12 01:50:20 crc kubenswrapper[4606]: I1212 01:50:20.143212 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xngsn_166cd42d-4038-46ed-aa22-d264904eb215/kube-rbac-proxy/0.log" Dec 12 01:50:20 crc kubenswrapper[4606]: I1212 01:50:20.546975 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xngsn_166cd42d-4038-46ed-aa22-d264904eb215/speaker/0.log" Dec 12 01:50:24 crc kubenswrapper[4606]: I1212 01:50:24.700348 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:50:24 crc kubenswrapper[4606]: E1212 01:50:24.703058 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:50:34 crc kubenswrapper[4606]: I1212 
01:50:34.949855 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf_ca9f70a4-aa76-4acc-bcd5-90581609d523/util/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.081242 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf_ca9f70a4-aa76-4acc-bcd5-90581609d523/util/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.137938 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf_ca9f70a4-aa76-4acc-bcd5-90581609d523/pull/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.197515 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf_ca9f70a4-aa76-4acc-bcd5-90581609d523/pull/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.349478 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf_ca9f70a4-aa76-4acc-bcd5-90581609d523/util/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.363302 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf_ca9f70a4-aa76-4acc-bcd5-90581609d523/pull/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.465892 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4mzlxf_ca9f70a4-aa76-4acc-bcd5-90581609d523/extract/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.548886 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m_dac2009b-c1b1-4c9a-8d8b-045e0c3b4545/util/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.764045 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m_dac2009b-c1b1-4c9a-8d8b-045e0c3b4545/pull/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.769817 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m_dac2009b-c1b1-4c9a-8d8b-045e0c3b4545/pull/0.log" Dec 12 01:50:35 crc kubenswrapper[4606]: I1212 01:50:35.781266 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m_dac2009b-c1b1-4c9a-8d8b-045e0c3b4545/util/0.log" Dec 12 01:50:36 crc kubenswrapper[4606]: I1212 01:50:36.005530 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m_dac2009b-c1b1-4c9a-8d8b-045e0c3b4545/extract/0.log" Dec 12 01:50:36 crc kubenswrapper[4606]: I1212 01:50:36.040796 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m_dac2009b-c1b1-4c9a-8d8b-045e0c3b4545/pull/0.log" Dec 12 01:50:36 crc kubenswrapper[4606]: I1212 01:50:36.050470 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8bpn9m_dac2009b-c1b1-4c9a-8d8b-045e0c3b4545/util/0.log" Dec 12 01:50:36 crc kubenswrapper[4606]: I1212 01:50:36.661647 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fp5q4_8290e814-06ee-41a9-a13a-d3c6c94c87b3/extract-utilities/0.log" Dec 12 01:50:36 crc 
kubenswrapper[4606]: I1212 01:50:36.699767 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:50:36 crc kubenswrapper[4606]: I1212 01:50:36.874641 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fp5q4_8290e814-06ee-41a9-a13a-d3c6c94c87b3/extract-content/0.log" Dec 12 01:50:36 crc kubenswrapper[4606]: I1212 01:50:36.914114 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fp5q4_8290e814-06ee-41a9-a13a-d3c6c94c87b3/extract-utilities/0.log" Dec 12 01:50:36 crc kubenswrapper[4606]: I1212 01:50:36.960705 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fp5q4_8290e814-06ee-41a9-a13a-d3c6c94c87b3/extract-content/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.112605 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fp5q4_8290e814-06ee-41a9-a13a-d3c6c94c87b3/extract-utilities/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.127277 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fp5q4_8290e814-06ee-41a9-a13a-d3c6c94c87b3/extract-content/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.362294 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x6rsl_bf73008f-0c71-4676-ae7a-8a3256c3df05/extract-utilities/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.602598 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fp5q4_8290e814-06ee-41a9-a13a-d3c6c94c87b3/registry-server/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.725006 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-x6rsl_bf73008f-0c71-4676-ae7a-8a3256c3df05/extract-utilities/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.780944 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x6rsl_bf73008f-0c71-4676-ae7a-8a3256c3df05/extract-content/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.786595 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x6rsl_bf73008f-0c71-4676-ae7a-8a3256c3df05/extract-content/0.log" Dec 12 01:50:37 crc kubenswrapper[4606]: I1212 01:50:37.845612 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"1feb1ef0232c3ca356e8c29f9af5822520b4078000ea7efa9d0c5b72506209e7"} Dec 12 01:50:38 crc kubenswrapper[4606]: I1212 01:50:38.558035 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x6rsl_bf73008f-0c71-4676-ae7a-8a3256c3df05/extract-utilities/0.log" Dec 12 01:50:38 crc kubenswrapper[4606]: I1212 01:50:38.679293 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x6rsl_bf73008f-0c71-4676-ae7a-8a3256c3df05/extract-content/0.log" Dec 12 01:50:38 crc kubenswrapper[4606]: I1212 01:50:38.930878 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xqq5z_bf10904b-21cd-4987-bedb-118b0992002a/marketplace-operator/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.047317 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v92fv_74f91f8e-c973-4ffb-89d2-8b0683578a84/extract-utilities/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.352063 4606 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v92fv_74f91f8e-c973-4ffb-89d2-8b0683578a84/extract-content/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.373711 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v92fv_74f91f8e-c973-4ffb-89d2-8b0683578a84/extract-content/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.430560 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x6rsl_bf73008f-0c71-4676-ae7a-8a3256c3df05/registry-server/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.452107 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v92fv_74f91f8e-c973-4ffb-89d2-8b0683578a84/extract-utilities/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.667733 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v92fv_74f91f8e-c973-4ffb-89d2-8b0683578a84/extract-content/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.675046 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v92fv_74f91f8e-c973-4ffb-89d2-8b0683578a84/extract-utilities/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.709757 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whwl_da3f6c49-c4b6-4fee-a3f5-1635d73e62f2/extract-utilities/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.821240 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v92fv_74f91f8e-c973-4ffb-89d2-8b0683578a84/registry-server/0.log" Dec 12 01:50:39 crc kubenswrapper[4606]: I1212 01:50:39.960600 4606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8whwl_da3f6c49-c4b6-4fee-a3f5-1635d73e62f2/extract-content/0.log" Dec 12 01:50:40 crc kubenswrapper[4606]: I1212 01:50:40.014514 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whwl_da3f6c49-c4b6-4fee-a3f5-1635d73e62f2/extract-utilities/0.log" Dec 12 01:50:40 crc kubenswrapper[4606]: I1212 01:50:40.019862 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whwl_da3f6c49-c4b6-4fee-a3f5-1635d73e62f2/extract-content/0.log" Dec 12 01:50:40 crc kubenswrapper[4606]: I1212 01:50:40.301544 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whwl_da3f6c49-c4b6-4fee-a3f5-1635d73e62f2/extract-utilities/0.log" Dec 12 01:50:40 crc kubenswrapper[4606]: I1212 01:50:40.405002 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whwl_da3f6c49-c4b6-4fee-a3f5-1635d73e62f2/extract-content/0.log" Dec 12 01:50:40 crc kubenswrapper[4606]: I1212 01:50:40.488863 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8whwl_da3f6c49-c4b6-4fee-a3f5-1635d73e62f2/registry-server/0.log" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.235384 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xsjtc"] Dec 12 01:52:01 crc kubenswrapper[4606]: E1212 01:52:01.236478 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="extract-content" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.236493 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="extract-content" Dec 12 01:52:01 crc kubenswrapper[4606]: E1212 01:52:01.236530 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="extract-utilities" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.236538 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="extract-utilities" Dec 12 01:52:01 crc kubenswrapper[4606]: E1212 01:52:01.236557 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="registry-server" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.236562 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="registry-server" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.236734 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="86afec2f-9c7d-4cdd-9e08-6ff06d5e78a8" containerName="registry-server" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.238688 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.252249 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsjtc"] Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.319749 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zvq5\" (UniqueName: \"kubernetes.io/projected/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-kube-api-access-6zvq5\") pod \"redhat-marketplace-xsjtc\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.319835 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-catalog-content\") pod \"redhat-marketplace-xsjtc\" (UID: 
\"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.319861 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-utilities\") pod \"redhat-marketplace-xsjtc\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.421478 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zvq5\" (UniqueName: \"kubernetes.io/projected/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-kube-api-access-6zvq5\") pod \"redhat-marketplace-xsjtc\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.421576 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-catalog-content\") pod \"redhat-marketplace-xsjtc\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.421608 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-utilities\") pod \"redhat-marketplace-xsjtc\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.422077 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-utilities\") pod \"redhat-marketplace-xsjtc\" (UID: 
\"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.422217 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-catalog-content\") pod \"redhat-marketplace-xsjtc\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.457121 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zvq5\" (UniqueName: \"kubernetes.io/projected/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-kube-api-access-6zvq5\") pod \"redhat-marketplace-xsjtc\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:01 crc kubenswrapper[4606]: I1212 01:52:01.560284 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:02 crc kubenswrapper[4606]: I1212 01:52:02.134736 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsjtc"] Dec 12 01:52:02 crc kubenswrapper[4606]: W1212 01:52:02.145326 4606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc74bb6ce_3fe0_4777_a61f_46b5baf7d2fa.slice/crio-abb0dd7862cf6b8eca4caa8fba976991b1b5684bbbda3fdad1b9468e5dd8c4a1 WatchSource:0}: Error finding container abb0dd7862cf6b8eca4caa8fba976991b1b5684bbbda3fdad1b9468e5dd8c4a1: Status 404 returned error can't find the container with id abb0dd7862cf6b8eca4caa8fba976991b1b5684bbbda3fdad1b9468e5dd8c4a1 Dec 12 01:52:02 crc kubenswrapper[4606]: I1212 01:52:02.789569 4606 generic.go:334] "Generic (PLEG): container finished" podID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" 
containerID="5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99" exitCode=0 Dec 12 01:52:02 crc kubenswrapper[4606]: I1212 01:52:02.789775 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsjtc" event={"ID":"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa","Type":"ContainerDied","Data":"5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99"} Dec 12 01:52:02 crc kubenswrapper[4606]: I1212 01:52:02.790075 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsjtc" event={"ID":"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa","Type":"ContainerStarted","Data":"abb0dd7862cf6b8eca4caa8fba976991b1b5684bbbda3fdad1b9468e5dd8c4a1"} Dec 12 01:52:02 crc kubenswrapper[4606]: I1212 01:52:02.791742 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 01:52:03 crc kubenswrapper[4606]: I1212 01:52:03.800165 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsjtc" event={"ID":"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa","Type":"ContainerStarted","Data":"22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21"} Dec 12 01:52:04 crc kubenswrapper[4606]: I1212 01:52:04.812332 4606 generic.go:334] "Generic (PLEG): container finished" podID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerID="22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21" exitCode=0 Dec 12 01:52:04 crc kubenswrapper[4606]: I1212 01:52:04.812593 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsjtc" event={"ID":"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa","Type":"ContainerDied","Data":"22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21"} Dec 12 01:52:05 crc kubenswrapper[4606]: I1212 01:52:05.822378 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsjtc" 
event={"ID":"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa","Type":"ContainerStarted","Data":"8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c"} Dec 12 01:52:05 crc kubenswrapper[4606]: I1212 01:52:05.860809 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xsjtc" podStartSLOduration=2.260627808 podStartE2EDuration="4.860753358s" podCreationTimestamp="2025-12-12 01:52:01 +0000 UTC" firstStartedPulling="2025-12-12 01:52:02.791301775 +0000 UTC m=+5313.336654681" lastFinishedPulling="2025-12-12 01:52:05.391427355 +0000 UTC m=+5315.936780231" observedRunningTime="2025-12-12 01:52:05.850901317 +0000 UTC m=+5316.396254173" watchObservedRunningTime="2025-12-12 01:52:05.860753358 +0000 UTC m=+5316.406106234" Dec 12 01:52:11 crc kubenswrapper[4606]: I1212 01:52:11.560932 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:11 crc kubenswrapper[4606]: I1212 01:52:11.561655 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:11 crc kubenswrapper[4606]: I1212 01:52:11.616342 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:12 crc kubenswrapper[4606]: I1212 01:52:12.209715 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:12 crc kubenswrapper[4606]: I1212 01:52:12.267247 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsjtc"] Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.190556 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xsjtc" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="registry-server" 
containerID="cri-o://8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c" gracePeriod=2 Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.675544 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.712108 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-catalog-content\") pod \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.712254 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zvq5\" (UniqueName: \"kubernetes.io/projected/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-kube-api-access-6zvq5\") pod \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.712312 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-utilities\") pod \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\" (UID: \"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa\") " Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.713797 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-utilities" (OuterVolumeSpecName: "utilities") pod "c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" (UID: "c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.721400 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-kube-api-access-6zvq5" (OuterVolumeSpecName: "kube-api-access-6zvq5") pod "c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" (UID: "c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa"). InnerVolumeSpecName "kube-api-access-6zvq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.756310 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" (UID: "c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.814959 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.815009 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:52:14 crc kubenswrapper[4606]: I1212 01:52:14.815022 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zvq5\" (UniqueName: \"kubernetes.io/projected/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa-kube-api-access-6zvq5\") on node \"crc\" DevicePath \"\"" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.202070 4606 generic.go:334] "Generic (PLEG): container finished" podID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" 
containerID="8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c" exitCode=0 Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.202237 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsjtc" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.202263 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsjtc" event={"ID":"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa","Type":"ContainerDied","Data":"8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c"} Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.204254 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsjtc" event={"ID":"c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa","Type":"ContainerDied","Data":"abb0dd7862cf6b8eca4caa8fba976991b1b5684bbbda3fdad1b9468e5dd8c4a1"} Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.204286 4606 scope.go:117] "RemoveContainer" containerID="8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.253981 4606 scope.go:117] "RemoveContainer" containerID="22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.257051 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsjtc"] Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.270115 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsjtc"] Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.278317 4606 scope.go:117] "RemoveContainer" containerID="5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.343281 4606 scope.go:117] "RemoveContainer" containerID="8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c" Dec 12 
01:52:15 crc kubenswrapper[4606]: E1212 01:52:15.343904 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c\": container with ID starting with 8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c not found: ID does not exist" containerID="8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.343944 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c"} err="failed to get container status \"8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c\": rpc error: code = NotFound desc = could not find container \"8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c\": container with ID starting with 8f8bcd96e2a52a6d5a4fc68c3ad127630dd78f7d49d1552030f3d646941eb81c not found: ID does not exist" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.343969 4606 scope.go:117] "RemoveContainer" containerID="22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21" Dec 12 01:52:15 crc kubenswrapper[4606]: E1212 01:52:15.344254 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21\": container with ID starting with 22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21 not found: ID does not exist" containerID="22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.344283 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21"} err="failed to get container status 
\"22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21\": rpc error: code = NotFound desc = could not find container \"22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21\": container with ID starting with 22c4c325d1cfcf2b1115ed8083777b856755b070256b79553df68abb015f4f21 not found: ID does not exist" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.344299 4606 scope.go:117] "RemoveContainer" containerID="5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99" Dec 12 01:52:15 crc kubenswrapper[4606]: E1212 01:52:15.345089 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99\": container with ID starting with 5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99 not found: ID does not exist" containerID="5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.345477 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99"} err="failed to get container status \"5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99\": rpc error: code = NotFound desc = could not find container \"5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99\": container with ID starting with 5b8d59404f7f53a5428906d89613a44c4720d4acaae6b6090c76071cf1c45a99 not found: ID does not exist" Dec 12 01:52:15 crc kubenswrapper[4606]: I1212 01:52:15.709993 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" path="/var/lib/kubelet/pods/c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa/volumes" Dec 12 01:53:02 crc kubenswrapper[4606]: I1212 01:53:02.011275 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:53:02 crc kubenswrapper[4606]: I1212 01:53:02.012014 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:53:04 crc kubenswrapper[4606]: I1212 01:53:04.998599 4606 scope.go:117] "RemoveContainer" containerID="47a82264c77f652dc997af38d2a6b0fbc52ce206c7ab942aca55c3273342d8a3" Dec 12 01:53:32 crc kubenswrapper[4606]: I1212 01:53:32.010527 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:53:32 crc kubenswrapper[4606]: I1212 01:53:32.011088 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:53:50 crc kubenswrapper[4606]: I1212 01:53:50.207800 4606 generic.go:334] "Generic (PLEG): container finished" podID="26000cea-02cc-4449-b125-39aa4ca0015f" containerID="74e0cd3215c8d1f220c99a05faed3ceb40de2845f8e5cee6925167dbde206113" exitCode=0 Dec 12 01:53:50 crc kubenswrapper[4606]: I1212 01:53:50.207917 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" 
event={"ID":"26000cea-02cc-4449-b125-39aa4ca0015f","Type":"ContainerDied","Data":"74e0cd3215c8d1f220c99a05faed3ceb40de2845f8e5cee6925167dbde206113"} Dec 12 01:53:50 crc kubenswrapper[4606]: I1212 01:53:50.209373 4606 scope.go:117] "RemoveContainer" containerID="74e0cd3215c8d1f220c99a05faed3ceb40de2845f8e5cee6925167dbde206113" Dec 12 01:53:50 crc kubenswrapper[4606]: I1212 01:53:50.782273 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bh6lp_must-gather-xw4n5_26000cea-02cc-4449-b125-39aa4ca0015f/gather/0.log" Dec 12 01:53:58 crc kubenswrapper[4606]: I1212 01:53:58.973996 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bh6lp/must-gather-xw4n5"] Dec 12 01:53:58 crc kubenswrapper[4606]: I1212 01:53:58.975336 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" containerName="copy" containerID="cri-o://cc179539ddc07f641d1fb9442736f702b44cd48f70827ccfe88e12570478b6a9" gracePeriod=2 Dec 12 01:53:58 crc kubenswrapper[4606]: I1212 01:53:58.990260 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bh6lp/must-gather-xw4n5"] Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.319811 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bh6lp_must-gather-xw4n5_26000cea-02cc-4449-b125-39aa4ca0015f/copy/0.log" Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.320642 4606 generic.go:334] "Generic (PLEG): container finished" podID="26000cea-02cc-4449-b125-39aa4ca0015f" containerID="cc179539ddc07f641d1fb9442736f702b44cd48f70827ccfe88e12570478b6a9" exitCode=143 Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.410925 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bh6lp_must-gather-xw4n5_26000cea-02cc-4449-b125-39aa4ca0015f/copy/0.log" Dec 12 01:53:59 crc kubenswrapper[4606]: 
I1212 01:53:59.411463 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.568298 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26000cea-02cc-4449-b125-39aa4ca0015f-must-gather-output\") pod \"26000cea-02cc-4449-b125-39aa4ca0015f\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.568736 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lck2b\" (UniqueName: \"kubernetes.io/projected/26000cea-02cc-4449-b125-39aa4ca0015f-kube-api-access-lck2b\") pod \"26000cea-02cc-4449-b125-39aa4ca0015f\" (UID: \"26000cea-02cc-4449-b125-39aa4ca0015f\") " Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.585628 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26000cea-02cc-4449-b125-39aa4ca0015f-kube-api-access-lck2b" (OuterVolumeSpecName: "kube-api-access-lck2b") pod "26000cea-02cc-4449-b125-39aa4ca0015f" (UID: "26000cea-02cc-4449-b125-39aa4ca0015f"). InnerVolumeSpecName "kube-api-access-lck2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.670476 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lck2b\" (UniqueName: \"kubernetes.io/projected/26000cea-02cc-4449-b125-39aa4ca0015f-kube-api-access-lck2b\") on node \"crc\" DevicePath \"\"" Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.739378 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26000cea-02cc-4449-b125-39aa4ca0015f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "26000cea-02cc-4449-b125-39aa4ca0015f" (UID: "26000cea-02cc-4449-b125-39aa4ca0015f"). 
InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:53:59 crc kubenswrapper[4606]: I1212 01:53:59.772079 4606 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26000cea-02cc-4449-b125-39aa4ca0015f-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 12 01:54:00 crc kubenswrapper[4606]: I1212 01:54:00.339900 4606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bh6lp_must-gather-xw4n5_26000cea-02cc-4449-b125-39aa4ca0015f/copy/0.log" Dec 12 01:54:00 crc kubenswrapper[4606]: I1212 01:54:00.340457 4606 scope.go:117] "RemoveContainer" containerID="cc179539ddc07f641d1fb9442736f702b44cd48f70827ccfe88e12570478b6a9" Dec 12 01:54:00 crc kubenswrapper[4606]: I1212 01:54:00.340634 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bh6lp/must-gather-xw4n5" Dec 12 01:54:00 crc kubenswrapper[4606]: I1212 01:54:00.373985 4606 scope.go:117] "RemoveContainer" containerID="74e0cd3215c8d1f220c99a05faed3ceb40de2845f8e5cee6925167dbde206113" Dec 12 01:54:01 crc kubenswrapper[4606]: I1212 01:54:01.711050 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" path="/var/lib/kubelet/pods/26000cea-02cc-4449-b125-39aa4ca0015f/volumes" Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.010284 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.010361 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.010416 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.011269 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1feb1ef0232c3ca356e8c29f9af5822520b4078000ea7efa9d0c5b72506209e7"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.011335 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://1feb1ef0232c3ca356e8c29f9af5822520b4078000ea7efa9d0c5b72506209e7" gracePeriod=600 Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.363642 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="1feb1ef0232c3ca356e8c29f9af5822520b4078000ea7efa9d0c5b72506209e7" exitCode=0 Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.364405 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"1feb1ef0232c3ca356e8c29f9af5822520b4078000ea7efa9d0c5b72506209e7"} Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.364510 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" 
event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerStarted","Data":"b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca"} Dec 12 01:54:02 crc kubenswrapper[4606]: I1212 01:54:02.364542 4606 scope.go:117] "RemoveContainer" containerID="e143db5c8656674439f9bedbd23be8e8c05444861157fb9728cbde496d7fbeb8" Dec 12 01:54:05 crc kubenswrapper[4606]: I1212 01:54:05.096028 4606 scope.go:117] "RemoveContainer" containerID="df8f55afa453e3bb3b1ac0a41a255792d20d0df8f2dbabd1460b00b95021813f" Dec 12 01:56:02 crc kubenswrapper[4606]: I1212 01:56:02.010724 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:56:02 crc kubenswrapper[4606]: I1212 01:56:02.011449 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:56:32 crc kubenswrapper[4606]: I1212 01:56:32.011015 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:56:32 crc kubenswrapper[4606]: I1212 01:56:32.011662 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.480916 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cftxh"] Dec 12 01:56:37 crc kubenswrapper[4606]: E1212 01:56:37.481969 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="registry-server" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.481985 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="registry-server" Dec 12 01:56:37 crc kubenswrapper[4606]: E1212 01:56:37.482001 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="extract-utilities" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.482010 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="extract-utilities" Dec 12 01:56:37 crc kubenswrapper[4606]: E1212 01:56:37.482029 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="extract-content" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.482037 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="extract-content" Dec 12 01:56:37 crc kubenswrapper[4606]: E1212 01:56:37.482068 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" containerName="gather" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.482076 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" containerName="gather" Dec 12 01:56:37 crc kubenswrapper[4606]: E1212 01:56:37.482089 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" containerName="copy" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 
01:56:37.482098 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" containerName="copy" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.482392 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74bb6ce-3fe0-4777-a61f-46b5baf7d2fa" containerName="registry-server" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.482413 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" containerName="gather" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.482442 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="26000cea-02cc-4449-b125-39aa4ca0015f" containerName="copy" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.484307 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.510278 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cftxh"] Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.634358 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-catalog-content\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.635246 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcmq\" (UniqueName: \"kubernetes.io/projected/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-kube-api-access-gqcmq\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.635407 
4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-utilities\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.737507 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcmq\" (UniqueName: \"kubernetes.io/projected/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-kube-api-access-gqcmq\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.737829 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-utilities\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.738038 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-catalog-content\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.738381 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-utilities\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.738496 4606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-catalog-content\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.757295 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcmq\" (UniqueName: \"kubernetes.io/projected/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-kube-api-access-gqcmq\") pod \"certified-operators-cftxh\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:37 crc kubenswrapper[4606]: I1212 01:56:37.833232 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:38 crc kubenswrapper[4606]: I1212 01:56:38.396672 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cftxh"] Dec 12 01:56:39 crc kubenswrapper[4606]: I1212 01:56:39.072139 4606 generic.go:334] "Generic (PLEG): container finished" podID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerID="b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002" exitCode=0 Dec 12 01:56:39 crc kubenswrapper[4606]: I1212 01:56:39.072265 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftxh" event={"ID":"19ac4ab0-ce2f-4b83-bdda-17389bf729ca","Type":"ContainerDied","Data":"b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002"} Dec 12 01:56:39 crc kubenswrapper[4606]: I1212 01:56:39.072500 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftxh" event={"ID":"19ac4ab0-ce2f-4b83-bdda-17389bf729ca","Type":"ContainerStarted","Data":"7ff31e36c8168054a6c19d2363a6798094b22c528fb6f4f77638b1ac9f02b291"} Dec 12 
01:56:41 crc kubenswrapper[4606]: I1212 01:56:41.104696 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftxh" event={"ID":"19ac4ab0-ce2f-4b83-bdda-17389bf729ca","Type":"ContainerStarted","Data":"2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3"} Dec 12 01:56:43 crc kubenswrapper[4606]: I1212 01:56:43.127376 4606 generic.go:334] "Generic (PLEG): container finished" podID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerID="2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3" exitCode=0 Dec 12 01:56:43 crc kubenswrapper[4606]: I1212 01:56:43.127730 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftxh" event={"ID":"19ac4ab0-ce2f-4b83-bdda-17389bf729ca","Type":"ContainerDied","Data":"2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3"} Dec 12 01:56:44 crc kubenswrapper[4606]: I1212 01:56:44.143168 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftxh" event={"ID":"19ac4ab0-ce2f-4b83-bdda-17389bf729ca","Type":"ContainerStarted","Data":"43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a"} Dec 12 01:56:47 crc kubenswrapper[4606]: I1212 01:56:47.833522 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:47 crc kubenswrapper[4606]: I1212 01:56:47.833789 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:47 crc kubenswrapper[4606]: I1212 01:56:47.884037 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:48 crc kubenswrapper[4606]: I1212 01:56:48.223359 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:48 crc 
kubenswrapper[4606]: I1212 01:56:48.558133 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cftxh" podStartSLOduration=7.005780953 podStartE2EDuration="11.558099337s" podCreationTimestamp="2025-12-12 01:56:37 +0000 UTC" firstStartedPulling="2025-12-12 01:56:39.074488566 +0000 UTC m=+5589.619841432" lastFinishedPulling="2025-12-12 01:56:43.62680695 +0000 UTC m=+5594.172159816" observedRunningTime="2025-12-12 01:56:44.171238209 +0000 UTC m=+5594.716591095" watchObservedRunningTime="2025-12-12 01:56:48.558099337 +0000 UTC m=+5599.103452203" Dec 12 01:56:48 crc kubenswrapper[4606]: I1212 01:56:48.776275 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cftxh"] Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.206732 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cftxh" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="registry-server" containerID="cri-o://43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a" gracePeriod=2 Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.733567 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.850684 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-utilities\") pod \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.851087 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqcmq\" (UniqueName: \"kubernetes.io/projected/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-kube-api-access-gqcmq\") pod \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.851186 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-catalog-content\") pod \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\" (UID: \"19ac4ab0-ce2f-4b83-bdda-17389bf729ca\") " Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.851442 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-utilities" (OuterVolumeSpecName: "utilities") pod "19ac4ab0-ce2f-4b83-bdda-17389bf729ca" (UID: "19ac4ab0-ce2f-4b83-bdda-17389bf729ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.851955 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.857443 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-kube-api-access-gqcmq" (OuterVolumeSpecName: "kube-api-access-gqcmq") pod "19ac4ab0-ce2f-4b83-bdda-17389bf729ca" (UID: "19ac4ab0-ce2f-4b83-bdda-17389bf729ca"). InnerVolumeSpecName "kube-api-access-gqcmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.904811 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19ac4ab0-ce2f-4b83-bdda-17389bf729ca" (UID: "19ac4ab0-ce2f-4b83-bdda-17389bf729ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.953531 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqcmq\" (UniqueName: \"kubernetes.io/projected/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-kube-api-access-gqcmq\") on node \"crc\" DevicePath \"\"" Dec 12 01:56:50 crc kubenswrapper[4606]: I1212 01:56:50.953561 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ac4ab0-ce2f-4b83-bdda-17389bf729ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.221811 4606 generic.go:334] "Generic (PLEG): container finished" podID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerID="43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a" exitCode=0 Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.221851 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftxh" event={"ID":"19ac4ab0-ce2f-4b83-bdda-17389bf729ca","Type":"ContainerDied","Data":"43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a"} Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.221878 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftxh" event={"ID":"19ac4ab0-ce2f-4b83-bdda-17389bf729ca","Type":"ContainerDied","Data":"7ff31e36c8168054a6c19d2363a6798094b22c528fb6f4f77638b1ac9f02b291"} Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.221895 4606 scope.go:117] "RemoveContainer" containerID="43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.222025 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cftxh" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.256204 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cftxh"] Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.261925 4606 scope.go:117] "RemoveContainer" containerID="2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.264761 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cftxh"] Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.284946 4606 scope.go:117] "RemoveContainer" containerID="b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.323615 4606 scope.go:117] "RemoveContainer" containerID="43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a" Dec 12 01:56:51 crc kubenswrapper[4606]: E1212 01:56:51.324028 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a\": container with ID starting with 43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a not found: ID does not exist" containerID="43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.324073 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a"} err="failed to get container status \"43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a\": rpc error: code = NotFound desc = could not find container \"43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a\": container with ID starting with 43a17eaa66c30d4061745be1998e9db345dbc5a8ec49b9427bf738220126431a not 
found: ID does not exist" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.324109 4606 scope.go:117] "RemoveContainer" containerID="2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3" Dec 12 01:56:51 crc kubenswrapper[4606]: E1212 01:56:51.324466 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3\": container with ID starting with 2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3 not found: ID does not exist" containerID="2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.324501 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3"} err="failed to get container status \"2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3\": rpc error: code = NotFound desc = could not find container \"2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3\": container with ID starting with 2536a48b4f3d6e7571f6cb2d8e18d6634e4d6bb29b3dc3f5b43567ac7ae072b3 not found: ID does not exist" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.324537 4606 scope.go:117] "RemoveContainer" containerID="b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002" Dec 12 01:56:51 crc kubenswrapper[4606]: E1212 01:56:51.325761 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002\": container with ID starting with b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002 not found: ID does not exist" containerID="b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.325786 4606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002"} err="failed to get container status \"b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002\": rpc error: code = NotFound desc = could not find container \"b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002\": container with ID starting with b37810c1d6a699cb9698009d6a006b108da201247b2b7403e77b4dec7f27c002 not found: ID does not exist" Dec 12 01:56:51 crc kubenswrapper[4606]: I1212 01:56:51.722545 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" path="/var/lib/kubelet/pods/19ac4ab0-ce2f-4b83-bdda-17389bf729ca/volumes" Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.010812 4606 patch_prober.go:28] interesting pod/machine-config-daemon-cqmz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.011410 4606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.011467 4606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.012329 4606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca"} pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.012409 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" containerName="machine-config-daemon" containerID="cri-o://b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" gracePeriod=600 Dec 12 01:57:02 crc kubenswrapper[4606]: E1212 01:57:02.174986 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.338247 4606 generic.go:334] "Generic (PLEG): container finished" podID="a543e227-be89-40cb-941d-b4707cc28921" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" exitCode=0 Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.338282 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" event={"ID":"a543e227-be89-40cb-941d-b4707cc28921","Type":"ContainerDied","Data":"b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca"} Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.338346 4606 scope.go:117] "RemoveContainer" containerID="1feb1ef0232c3ca356e8c29f9af5822520b4078000ea7efa9d0c5b72506209e7" Dec 12 01:57:02 crc kubenswrapper[4606]: I1212 01:57:02.339064 4606 
scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" Dec 12 01:57:02 crc kubenswrapper[4606]: E1212 01:57:02.339859 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:57:12 crc kubenswrapper[4606]: I1212 01:57:12.699832 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" Dec 12 01:57:12 crc kubenswrapper[4606]: E1212 01:57:12.700749 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:57:23 crc kubenswrapper[4606]: I1212 01:57:23.701194 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" Dec 12 01:57:23 crc kubenswrapper[4606]: E1212 01:57:23.701938 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:57:38 crc kubenswrapper[4606]: I1212 
01:57:38.699823 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" Dec 12 01:57:38 crc kubenswrapper[4606]: E1212 01:57:38.700574 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:57:50 crc kubenswrapper[4606]: I1212 01:57:50.699683 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" Dec 12 01:57:50 crc kubenswrapper[4606]: E1212 01:57:50.700397 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.380471 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8fp5"] Dec 12 01:57:58 crc kubenswrapper[4606]: E1212 01:57:58.381734 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="extract-content" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.381755 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="extract-content" Dec 12 01:57:58 crc kubenswrapper[4606]: E1212 01:57:58.381789 4606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="extract-utilities" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.381802 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="extract-utilities" Dec 12 01:57:58 crc kubenswrapper[4606]: E1212 01:57:58.381825 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="registry-server" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.381840 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="registry-server" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.382225 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ac4ab0-ce2f-4b83-bdda-17389bf729ca" containerName="registry-server" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.384581 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.407262 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8fp5"] Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.449593 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-utilities\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.449920 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-catalog-content\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " 
pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.450071 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf7q8\" (UniqueName: \"kubernetes.io/projected/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-kube-api-access-kf7q8\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.552472 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-utilities\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.552769 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-catalog-content\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.552796 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf7q8\" (UniqueName: \"kubernetes.io/projected/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-kube-api-access-kf7q8\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.553407 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-utilities\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " 
pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.553410 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-catalog-content\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.575163 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf7q8\" (UniqueName: \"kubernetes.io/projected/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-kube-api-access-kf7q8\") pod \"redhat-operators-n8fp5\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:58 crc kubenswrapper[4606]: I1212 01:57:58.717548 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:57:59 crc kubenswrapper[4606]: I1212 01:57:59.260426 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8fp5"] Dec 12 01:57:59 crc kubenswrapper[4606]: I1212 01:57:59.919300 4606 generic.go:334] "Generic (PLEG): container finished" podID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerID="3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2" exitCode=0 Dec 12 01:57:59 crc kubenswrapper[4606]: I1212 01:57:59.919381 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8fp5" event={"ID":"b5bc78d7-2f72-4dbb-a106-75e40d7a8431","Type":"ContainerDied","Data":"3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2"} Dec 12 01:57:59 crc kubenswrapper[4606]: I1212 01:57:59.919710 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8fp5" 
event={"ID":"b5bc78d7-2f72-4dbb-a106-75e40d7a8431","Type":"ContainerStarted","Data":"142ace09cfb1bd5b6b04d32f731b4976f7cc330547b3f2bfaa5977cedafa86de"} Dec 12 01:57:59 crc kubenswrapper[4606]: I1212 01:57:59.921226 4606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 12 01:58:00 crc kubenswrapper[4606]: I1212 01:58:00.932516 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8fp5" event={"ID":"b5bc78d7-2f72-4dbb-a106-75e40d7a8431","Type":"ContainerStarted","Data":"08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade"} Dec 12 01:58:03 crc kubenswrapper[4606]: I1212 01:58:03.704202 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" Dec 12 01:58:03 crc kubenswrapper[4606]: E1212 01:58:03.705819 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:58:04 crc kubenswrapper[4606]: I1212 01:58:04.976715 4606 generic.go:334] "Generic (PLEG): container finished" podID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerID="08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade" exitCode=0 Dec 12 01:58:04 crc kubenswrapper[4606]: I1212 01:58:04.976754 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8fp5" event={"ID":"b5bc78d7-2f72-4dbb-a106-75e40d7a8431","Type":"ContainerDied","Data":"08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade"} Dec 12 01:58:05 crc kubenswrapper[4606]: I1212 01:58:05.989684 4606 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-n8fp5" event={"ID":"b5bc78d7-2f72-4dbb-a106-75e40d7a8431","Type":"ContainerStarted","Data":"a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441"} Dec 12 01:58:06 crc kubenswrapper[4606]: I1212 01:58:06.018980 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8fp5" podStartSLOduration=2.260764002 podStartE2EDuration="8.018945649s" podCreationTimestamp="2025-12-12 01:57:58 +0000 UTC" firstStartedPulling="2025-12-12 01:57:59.921011386 +0000 UTC m=+5670.466364252" lastFinishedPulling="2025-12-12 01:58:05.679193023 +0000 UTC m=+5676.224545899" observedRunningTime="2025-12-12 01:58:06.015324613 +0000 UTC m=+5676.560677519" watchObservedRunningTime="2025-12-12 01:58:06.018945649 +0000 UTC m=+5676.564298515" Dec 12 01:58:08 crc kubenswrapper[4606]: I1212 01:58:08.718634 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:58:08 crc kubenswrapper[4606]: I1212 01:58:08.719334 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:58:09 crc kubenswrapper[4606]: I1212 01:58:09.786971 4606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8fp5" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="registry-server" probeResult="failure" output=< Dec 12 01:58:09 crc kubenswrapper[4606]: timeout: failed to connect service ":50051" within 1s Dec 12 01:58:09 crc kubenswrapper[4606]: > Dec 12 01:58:16 crc kubenswrapper[4606]: I1212 01:58:16.700520 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca" Dec 12 01:58:16 crc kubenswrapper[4606]: E1212 01:58:16.701876 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921" Dec 12 01:58:18 crc kubenswrapper[4606]: I1212 01:58:18.783877 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:58:18 crc kubenswrapper[4606]: I1212 01:58:18.851558 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:58:19 crc kubenswrapper[4606]: I1212 01:58:19.037517 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8fp5"] Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.154479 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8fp5" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="registry-server" containerID="cri-o://a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441" gracePeriod=2 Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.654613 4606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.786530 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-utilities\") pod \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.787005 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-catalog-content\") pod \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.787271 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf7q8\" (UniqueName: \"kubernetes.io/projected/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-kube-api-access-kf7q8\") pod \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\" (UID: \"b5bc78d7-2f72-4dbb-a106-75e40d7a8431\") " Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.789473 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-utilities" (OuterVolumeSpecName: "utilities") pod "b5bc78d7-2f72-4dbb-a106-75e40d7a8431" (UID: "b5bc78d7-2f72-4dbb-a106-75e40d7a8431"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.797721 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-kube-api-access-kf7q8" (OuterVolumeSpecName: "kube-api-access-kf7q8") pod "b5bc78d7-2f72-4dbb-a106-75e40d7a8431" (UID: "b5bc78d7-2f72-4dbb-a106-75e40d7a8431"). InnerVolumeSpecName "kube-api-access-kf7q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.888838 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf7q8\" (UniqueName: \"kubernetes.io/projected/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-kube-api-access-kf7q8\") on node \"crc\" DevicePath \"\"" Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.888868 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-utilities\") on node \"crc\" DevicePath \"\"" Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.937164 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5bc78d7-2f72-4dbb-a106-75e40d7a8431" (UID: "b5bc78d7-2f72-4dbb-a106-75e40d7a8431"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 12 01:58:20 crc kubenswrapper[4606]: I1212 01:58:20.990575 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bc78d7-2f72-4dbb-a106-75e40d7a8431-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.166816 4606 generic.go:334] "Generic (PLEG): container finished" podID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerID="a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441" exitCode=0 Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.167009 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8fp5" event={"ID":"b5bc78d7-2f72-4dbb-a106-75e40d7a8431","Type":"ContainerDied","Data":"a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441"} Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.167146 4606 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8fp5" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.167923 4606 scope.go:117] "RemoveContainer" containerID="a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.167802 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8fp5" event={"ID":"b5bc78d7-2f72-4dbb-a106-75e40d7a8431","Type":"ContainerDied","Data":"142ace09cfb1bd5b6b04d32f731b4976f7cc330547b3f2bfaa5977cedafa86de"} Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.206824 4606 scope.go:117] "RemoveContainer" containerID="08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.227515 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8fp5"] Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.250528 4606 scope.go:117] "RemoveContainer" containerID="3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.254339 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8fp5"] Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.294289 4606 scope.go:117] "RemoveContainer" containerID="a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441" Dec 12 01:58:21 crc kubenswrapper[4606]: E1212 01:58:21.295823 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441\": container with ID starting with a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441 not found: ID does not exist" containerID="a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.295964 4606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441"} err="failed to get container status \"a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441\": rpc error: code = NotFound desc = could not find container \"a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441\": container with ID starting with a2ab53e1597448252a1e71634255b20fb715ad7270527ae26d0fd97ff413c441 not found: ID does not exist" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.296069 4606 scope.go:117] "RemoveContainer" containerID="08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade" Dec 12 01:58:21 crc kubenswrapper[4606]: E1212 01:58:21.296569 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade\": container with ID starting with 08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade not found: ID does not exist" containerID="08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.296603 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade"} err="failed to get container status \"08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade\": rpc error: code = NotFound desc = could not find container \"08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade\": container with ID starting with 08e9986973ba3752e77162a87833970e9b78fee9fc52fb90fb5435961ca53ade not found: ID does not exist" Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.296626 4606 scope.go:117] "RemoveContainer" containerID="3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2" Dec 12 01:58:21 crc kubenswrapper[4606]: E1212 
01:58:21.296934 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2\": container with ID starting with 3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2 not found: ID does not exist" containerID="3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2"
Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.296957 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2"} err="failed to get container status \"3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2\": rpc error: code = NotFound desc = could not find container \"3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2\": container with ID starting with 3eef6e6b2251159eb0e388bf077fc285d93b2be4f44fea6ebe6a48443279c0a2 not found: ID does not exist"
Dec 12 01:58:21 crc kubenswrapper[4606]: I1212 01:58:21.720561 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" path="/var/lib/kubelet/pods/b5bc78d7-2f72-4dbb-a106-75e40d7a8431/volumes"
Dec 12 01:58:31 crc kubenswrapper[4606]: I1212 01:58:31.699786 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca"
Dec 12 01:58:31 crc kubenswrapper[4606]: E1212 01:58:31.700484 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:58:42 crc kubenswrapper[4606]: I1212 01:58:42.699572 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca"
Dec 12 01:58:42 crc kubenswrapper[4606]: E1212 01:58:42.700346 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:58:52 crc kubenswrapper[4606]: I1212 01:58:52.893865 4606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-swvkl"]
Dec 12 01:58:52 crc kubenswrapper[4606]: E1212 01:58:52.896668 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="registry-server"
Dec 12 01:58:52 crc kubenswrapper[4606]: I1212 01:58:52.896708 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="registry-server"
Dec 12 01:58:52 crc kubenswrapper[4606]: E1212 01:58:52.896742 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="extract-content"
Dec 12 01:58:52 crc kubenswrapper[4606]: I1212 01:58:52.896752 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="extract-content"
Dec 12 01:58:52 crc kubenswrapper[4606]: E1212 01:58:52.896778 4606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="extract-utilities"
Dec 12 01:58:52 crc kubenswrapper[4606]: I1212 01:58:52.896787 4606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="extract-utilities"
Dec 12 01:58:52 crc kubenswrapper[4606]: I1212 01:58:52.897043 4606 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5bc78d7-2f72-4dbb-a106-75e40d7a8431" containerName="registry-server"
Dec 12 01:58:52 crc kubenswrapper[4606]: I1212 01:58:52.898864 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:52 crc kubenswrapper[4606]: I1212 01:58:52.904394 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swvkl"]
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.000077 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-utilities\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.000151 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-catalog-content\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.000648 4606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vwd4\" (UniqueName: \"kubernetes.io/projected/594610e2-7cb3-447f-91fe-89e0ed9d5720-kube-api-access-2vwd4\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.102020 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vwd4\" (UniqueName: \"kubernetes.io/projected/594610e2-7cb3-447f-91fe-89e0ed9d5720-kube-api-access-2vwd4\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.102145 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-utilities\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.102175 4606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-catalog-content\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.102756 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-catalog-content\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.102804 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-utilities\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.124384 4606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vwd4\" (UniqueName: \"kubernetes.io/projected/594610e2-7cb3-447f-91fe-89e0ed9d5720-kube-api-access-2vwd4\") pod \"community-operators-swvkl\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") " pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.237579 4606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:58:53 crc kubenswrapper[4606]: I1212 01:58:53.824995 4606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swvkl"]
Dec 12 01:58:54 crc kubenswrapper[4606]: I1212 01:58:54.502390 4606 generic.go:334] "Generic (PLEG): container finished" podID="594610e2-7cb3-447f-91fe-89e0ed9d5720" containerID="f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d" exitCode=0
Dec 12 01:58:54 crc kubenswrapper[4606]: I1212 01:58:54.502469 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swvkl" event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerDied","Data":"f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d"}
Dec 12 01:58:54 crc kubenswrapper[4606]: I1212 01:58:54.502660 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swvkl" event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerStarted","Data":"68ff662a9db169a0f59a9da0e642a9ec59e1e30394d1ea3896d4b5121bd696eb"}
Dec 12 01:58:55 crc kubenswrapper[4606]: I1212 01:58:55.514422 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swvkl" event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerStarted","Data":"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"}
Dec 12 01:58:56 crc kubenswrapper[4606]: I1212 01:58:56.527321 4606 generic.go:334] "Generic (PLEG): container finished" podID="594610e2-7cb3-447f-91fe-89e0ed9d5720" containerID="323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26" exitCode=0
Dec 12 01:58:56 crc kubenswrapper[4606]: I1212 01:58:56.527513 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swvkl" event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerDied","Data":"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"}
Dec 12 01:58:56 crc kubenswrapper[4606]: I1212 01:58:56.700899 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca"
Dec 12 01:58:56 crc kubenswrapper[4606]: E1212 01:58:56.701417 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
Dec 12 01:58:57 crc kubenswrapper[4606]: I1212 01:58:57.542378 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swvkl" event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerStarted","Data":"1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932"}
Dec 12 01:58:57 crc kubenswrapper[4606]: I1212 01:58:57.563724 4606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-swvkl" podStartSLOduration=2.834420776 podStartE2EDuration="5.563704602s" podCreationTimestamp="2025-12-12 01:58:52 +0000 UTC" firstStartedPulling="2025-12-12 01:58:54.508110115 +0000 UTC m=+5725.053463001" lastFinishedPulling="2025-12-12 01:58:57.237393961 +0000 UTC m=+5727.782746827" observedRunningTime="2025-12-12 01:58:57.55759258 +0000 UTC m=+5728.102945446" watchObservedRunningTime="2025-12-12 01:58:57.563704602 +0000 UTC m=+5728.109057478"
Dec 12 01:59:03 crc kubenswrapper[4606]: I1212 01:59:03.237887 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:59:03 crc kubenswrapper[4606]: I1212 01:59:03.238670 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:59:03 crc kubenswrapper[4606]: I1212 01:59:03.311276 4606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:59:03 crc kubenswrapper[4606]: I1212 01:59:03.725030 4606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:59:03 crc kubenswrapper[4606]: I1212 01:59:03.778018 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swvkl"]
Dec 12 01:59:05 crc kubenswrapper[4606]: I1212 01:59:05.688315 4606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-swvkl" podUID="594610e2-7cb3-447f-91fe-89e0ed9d5720" containerName="registry-server" containerID="cri-o://1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932" gracePeriod=2
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.209254 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.322155 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-utilities\") pod \"594610e2-7cb3-447f-91fe-89e0ed9d5720\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") "
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.322280 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-catalog-content\") pod \"594610e2-7cb3-447f-91fe-89e0ed9d5720\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") "
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.322453 4606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vwd4\" (UniqueName: \"kubernetes.io/projected/594610e2-7cb3-447f-91fe-89e0ed9d5720-kube-api-access-2vwd4\") pod \"594610e2-7cb3-447f-91fe-89e0ed9d5720\" (UID: \"594610e2-7cb3-447f-91fe-89e0ed9d5720\") "
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.323444 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-utilities" (OuterVolumeSpecName: "utilities") pod "594610e2-7cb3-447f-91fe-89e0ed9d5720" (UID: "594610e2-7cb3-447f-91fe-89e0ed9d5720"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.335261 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594610e2-7cb3-447f-91fe-89e0ed9d5720-kube-api-access-2vwd4" (OuterVolumeSpecName: "kube-api-access-2vwd4") pod "594610e2-7cb3-447f-91fe-89e0ed9d5720" (UID: "594610e2-7cb3-447f-91fe-89e0ed9d5720"). InnerVolumeSpecName "kube-api-access-2vwd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.393034 4606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "594610e2-7cb3-447f-91fe-89e0ed9d5720" (UID: "594610e2-7cb3-447f-91fe-89e0ed9d5720"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.424912 4606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vwd4\" (UniqueName: \"kubernetes.io/projected/594610e2-7cb3-447f-91fe-89e0ed9d5720-kube-api-access-2vwd4\") on node \"crc\" DevicePath \"\""
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.424950 4606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-utilities\") on node \"crc\" DevicePath \"\""
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.424963 4606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594610e2-7cb3-447f-91fe-89e0ed9d5720-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.705621 4606 generic.go:334] "Generic (PLEG): container finished" podID="594610e2-7cb3-447f-91fe-89e0ed9d5720" containerID="1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932" exitCode=0
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.705683 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swvkl" event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerDied","Data":"1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932"}
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.705721 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swvkl" event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerDied","Data":"68ff662a9db169a0f59a9da0e642a9ec59e1e30394d1ea3896d4b5121bd696eb"}
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.705750 4606 scope.go:117] "RemoveContainer" containerID="1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.705749 4606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swvkl"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.749434 4606 scope.go:117] "RemoveContainer" containerID="323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.767151 4606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swvkl"]
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.779530 4606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-swvkl"]
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.780257 4606 scope.go:117] "RemoveContainer" containerID="f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.837708 4606 scope.go:117] "RemoveContainer" containerID="1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932"
Dec 12 01:59:06 crc kubenswrapper[4606]: E1212 01:59:06.838149 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932\": container with ID starting with 1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932 not found: ID does not exist" containerID="1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.838277 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932"} err="failed to get container status \"1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932\": rpc error: code = NotFound desc = could not find container \"1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932\": container with ID starting with 1a6d54e72d1887529bcb56ee2da76fba0782c71835b53117da0a21672bf08932 not found: ID does not exist"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.838371 4606 scope.go:117] "RemoveContainer" containerID="323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"
Dec 12 01:59:06 crc kubenswrapper[4606]: E1212 01:59:06.838888 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26\": container with ID starting with 323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26 not found: ID does not exist" containerID="323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.838925 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"} err="failed to get container status \"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26\": rpc error: code = NotFound desc = could not find container \"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26\": container with ID starting with 323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26 not found: ID does not exist"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.838950 4606 scope.go:117] "RemoveContainer" containerID="f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d"
Dec 12 01:59:06 crc kubenswrapper[4606]: E1212 01:59:06.839276 4606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d\": container with ID starting with f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d not found: ID does not exist" containerID="f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d"
Dec 12 01:59:06 crc kubenswrapper[4606]: I1212 01:59:06.839368 4606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d"} err="failed to get container status \"f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d\": rpc error: code = NotFound desc = could not find container \"f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d\": container with ID starting with f3452dea4a73c6ded14fa1066b22e76c22de461d7a55ef6811f3f08cbf8ac58d not found: ID does not exist"
Dec 12 01:59:07 crc kubenswrapper[4606]: I1212 01:59:07.714021 4606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594610e2-7cb3-447f-91fe-89e0ed9d5720" path="/var/lib/kubelet/pods/594610e2-7cb3-447f-91fe-89e0ed9d5720/volumes"
Dec 12 01:59:09 crc kubenswrapper[4606]: I1212 01:59:09.715415 4606 scope.go:117] "RemoveContainer" containerID="b9451c93e67195b53473de9f49f9e75a0b20fd899d008fc9d95b12dc527e5eca"
Dec 12 01:59:09 crc kubenswrapper[4606]: E1212 01:59:09.715784 4606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cqmz5_openshift-machine-config-operator(a543e227-be89-40cb-941d-b4707cc28921)\"" pod="openshift-machine-config-operator/machine-config-daemon-cqmz5" podUID="a543e227-be89-40cb-941d-b4707cc28921"
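The kubelet entries above use a klog-style text header: a severity letter (I/E), an MMDD date, a wall-clock time, the process PID, the source `file:line`, then the message. A minimal parsing sketch for that header (the regex and field names are my own assumptions, not an official klog grammar; the sample line is taken from the log above):

```python
import re

# klog-style header: <sev><MMDD> <HH:MM:SS.micros> <pid> <file:line>] <message>
KLOG_RE = re.compile(
    r'^(?P<sev>[IWEF])(?P<mmdd>\d{4}) '
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<pid>\d+) (?P<src>[\w./-]+:\d+)\] '
    r'(?P<msg>.*)$'
)

def parse_klog(line):
    """Return the header fields of one klog-formatted line as a dict, or None."""
    m = KLOG_RE.match(line)
    return m.groupdict() if m else None

entry = parse_klog(
    'I1212 01:58:52.893865 4606 kubelet.go:2421] '
    '"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-swvkl"]'
)
```

Filtering on `sev` and `src` is often enough to separate the benign INFO churn above from the repeated `pod_workers.go` errors.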
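The `SyncLoop (PLEG)` entries carry the container lifecycle as an `event={"ID":…,"Type":…,"Data":…}` payload, which is how the start and death of each `community-operators-swvkl` container can be paired up. A hedged extraction sketch (the regex is an assumption based on the payload shape visible above; the sample lines are copied from this log):

```python
import re

# Matches the event={"ID":...,"Type":...,"Data":...} payload in PLEG log lines.
EVENT_RE = re.compile(
    r'event=\{"ID":"(?P<pod>[^"]+)","Type":"(?P<type>[^"]+)","Data":"(?P<data>[^"]+)"\}'
)

def pleg_events(lines):
    """Yield (pod UID, event type, container/sandbox ID) for each PLEG event line."""
    for line in lines:
        m = EVENT_RE.search(line)
        if m:
            yield m.group('pod'), m.group('type'), m.group('data')

sample = [
    'I1212 01:58:55.514422 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" '
    'pod="openshift-marketplace/community-operators-swvkl" '
    'event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerStarted",'
    '"Data":"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"}',
    'I1212 01:58:56.527513 4606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" '
    'pod="openshift-marketplace/community-operators-swvkl" '
    'event={"ID":"594610e2-7cb3-447f-91fe-89e0ed9d5720","Type":"ContainerDied",'
    '"Data":"323b42e9644876a6b77fdced6040e87addac15f1477be489ee678e20daba3b26"}',
]
events = list(pleg_events(sample))
```

Grouping the yielded tuples by the `Data` ID reconstructs each container's Started/Died pair, which is exactly the init-container sequence (`extract-utilities`, `extract-content`, then `registry-server`) visible in the log.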
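The recurring `pod_workers.go` error embeds the CrashLoopBackOff delay (`back-off 5m0s`) plus the failing container and pod inside a doubly quoted `err=` string, which makes it awkward to grep structurally. A small sketch that pulls those fields out (regex and field names are my own; the sample string reproduces one of the error lines above, with the literal `\"` escapes the log contains):

```python
import re

# CrashLoopBackOff errors embed the back-off delay and the failing
# container/pod; extract them so repeats can be grouped per pod.
BACKOFF_RE = re.compile(
    r'back-off (?P<delay>\S+) restarting failed '
    r'container=(?P<container>\S+) pod=(?P<pod>[^(]+)\('
)

line = (
    'E1212 01:58:31.700484 4606 pod_workers.go:1301] "Error syncing pod, skipping" '
    'err="failed to \\"StartContainer\\" for \\"machine-config-daemon\\" with '
    'CrashLoopBackOff: \\"back-off 5m0s restarting failed '
    'container=machine-config-daemon '
    'pod=machine-config-daemon-cqmz5_openshift-machine-config-operator'
    '(a543e227-be89-40cb-941d-b4707cc28921)\\""'
)
m = BACKOFF_RE.search(line)
```

Counting matches per `container` over a time window shows how often the same pod (here `machine-config-daemon-cqmz5`, four times in this excerpt) is being held in back-off rather than actually restarted.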